[ 507.176407] env[68521]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 507.891069] env[68571]: Modules with known eventlet monkey patching issues were imported prior to eventlet monkey patching: urllib3. This warning can usually be ignored if the caller is only importing and not executing nova code.
[ 509.271528] env[68571]: DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' {{(pid=68571) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 509.271937] env[68571]: DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' {{(pid=68571) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 509.271987] env[68571]: DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' {{(pid=68571) initialize /opt/stack/data/venv/lib/python3.10/site-packages/os_vif/__init__.py:44}}
[ 509.272297] env[68571]: INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
[ 509.468371] env[68571]: DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm {{(pid=68571) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:384}}
[ 509.479149] env[68571]: DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.011s {{(pid=68571) execute /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/processutils.py:422}}
[ 509.590999] env[68571]: INFO nova.virt.driver [None req-bb09cf4e-1adc-4aad-a71b-7e6aede2d16c None None] Loading compute driver 'vmwareapi.VMwareVCDriver'
[ 509.669408] env[68571]: DEBUG oslo_concurrency.lockutils [-] Acquiring lock "oslo_vmware_api_lock" by "oslo_vmware.api.VMwareAPISession._create_session" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 509.669408] env[68571]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" acquired by "oslo_vmware.api.VMwareAPISession._create_session" :: waited 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 509.669408] env[68571]: DEBUG oslo_vmware.service [-] Creating suds client with soap_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk' and wsdl_url='https://vc1.osci.c.eu-de-1.cloud.sap:443/sdk/vimService.wsdl' {{(pid=68571) __init__ /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:242}}
[ 512.543810] env[68571]: DEBUG oslo_vmware.service [-] Invoking ServiceInstance.RetrieveServiceContent with opID=oslo.vmware-c452390e-beac-48d8-8a56-8373976d6c18 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 512.560678] env[68571]: DEBUG oslo_vmware.api [-] Logging into host: vc1.osci.c.eu-de-1.cloud.sap. {{(pid=68571) _create_session /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:242}}
[ 512.560833] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.Login with opID=oslo.vmware-573a0fff-f389-42bb-81d5-0b1fb41477fe {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 512.585015] env[68571]: INFO oslo_vmware.api [-] Successfully established new session; session ID is 54de7.
[ 512.585144] env[68571]: DEBUG oslo_concurrency.lockutils [-] Lock "oslo_vmware_api_lock" "released" by "oslo_vmware.api.VMwareAPISession._create_session" :: held 2.917s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 512.585664] env[68571]: INFO nova.virt.vmwareapi.driver [None req-bb09cf4e-1adc-4aad-a71b-7e6aede2d16c None None] VMware vCenter version: 7.0.3
[ 512.589208] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81f2a0d6-d35b-40cc-9d50-e95e3d94e0d7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 512.606250] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd63b148-b191-4901-8838-6491a2b93656 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 512.611904] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76403ebc-12a9-4717-aca1-94d4da04f2e8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 512.618394] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08668ed5-46e8-4f3b-a3ed-a98c07ec8043 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 512.631694] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-758a5a7f-ca49-43a8-b466-2b96d0d13972 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 512.637411] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f73c903-cc86-4b02-b9a9-8b5e2da2e219 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 512.668968] env[68571]: DEBUG oslo_vmware.service [-] Invoking ExtensionManager.FindExtension with opID=oslo.vmware-36d01e7a-15ec-4e4e-a4f7-e92b3e23000a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 512.674330] env[68571]: DEBUG nova.virt.vmwareapi.driver [None req-bb09cf4e-1adc-4aad-a71b-7e6aede2d16c None None] Extension org.openstack.compute already exists. {{(pid=68571) _register_openstack_extension /opt/stack/nova/nova/virt/vmwareapi/driver.py:224}}
[ 512.676959] env[68571]: INFO nova.compute.provider_config [None req-bb09cf4e-1adc-4aad-a71b-7e6aede2d16c None None] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
[ 512.695422] env[68571]: DEBUG nova.context [None req-bb09cf4e-1adc-4aad-a71b-7e6aede2d16c None None] Found 2 cells: 00000000-0000-0000-0000-000000000000(cell0),820033e8-12ef-4fe6-9430-254da2e59744(cell1) {{(pid=68571) load_cells /opt/stack/nova/nova/context.py:464}}
[ 512.697404] env[68571]: DEBUG oslo_concurrency.lockutils [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] Acquiring lock "00000000-0000-0000-0000-000000000000" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 512.697628] env[68571]: DEBUG oslo_concurrency.lockutils [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] Lock "00000000-0000-0000-0000-000000000000" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 512.698331] env[68571]: DEBUG oslo_concurrency.lockutils [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] Lock "00000000-0000-0000-0000-000000000000" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 512.698724] env[68571]: DEBUG oslo_concurrency.lockutils [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] Acquiring lock "820033e8-12ef-4fe6-9430-254da2e59744" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 512.698914] env[68571]: DEBUG oslo_concurrency.lockutils [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] Lock "820033e8-12ef-4fe6-9430-254da2e59744" acquired by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 512.699868] env[68571]: DEBUG oslo_concurrency.lockutils [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] Lock "820033e8-12ef-4fe6-9430-254da2e59744" "released" by "nova.context.set_target_cell.<locals>.get_or_set_cached_cell_and_set_connections" :: held 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 512.724996] env[68571]: INFO dbcounter [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] Registered counter for database nova_cell0
[ 512.733410] env[68571]: INFO dbcounter [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] Registered counter for database nova_cell1
[ 512.736347] env[68571]: DEBUG oslo_db.sqlalchemy.engines [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=68571) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 512.736704] env[68571]: DEBUG oslo_db.sqlalchemy.engines [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] MySQL server mode set to STRICT_TRANS_TABLES,STRICT_ALL_TABLES,NO_ZERO_IN_DATE,NO_ZERO_DATE,ERROR_FOR_DIVISION_BY_ZERO,TRADITIONAL,NO_AUTO_CREATE_USER,NO_ENGINE_SUBSTITUTION {{(pid=68571) _check_effective_sql_mode /opt/stack/data/venv/lib/python3.10/site-packages/oslo_db/sqlalchemy/engines.py:342}}
[ 512.741144] env[68571]: DEBUG dbcounter [-] [68571] Writer thread running {{(pid=68571) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 512.742229] env[68571]: DEBUG dbcounter [-] [68571] Writer thread running {{(pid=68571) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:102}}
[ 512.744390] env[68571]: ERROR nova.db.main.api [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 512.744390] env[68571]: result = function(*args, **kwargs)
[ 512.744390] env[68571]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 512.744390] env[68571]: return func(*args, **kwargs)
[ 512.744390] env[68571]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 512.744390] env[68571]: result = fn(*args, **kwargs)
[ 512.744390] env[68571]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 512.744390] env[68571]: return f(*args, **kwargs)
[ 512.744390] env[68571]: File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 512.744390] env[68571]: return db.service_get_minimum_version(context, binaries)
[ 512.744390] env[68571]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 512.744390] env[68571]: _check_db_access()
[ 512.744390] env[68571]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 512.744390] env[68571]: stacktrace = ''.join(traceback.format_stack())
[ 512.744390] env[68571]: 
[ 512.745553] env[68571]: ERROR nova.db.main.api [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] No DB access allowed in nova-compute: File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/greenthread.py", line 221, in main
[ 512.745553] env[68571]: result = function(*args, **kwargs)
[ 512.745553] env[68571]: File "/opt/stack/nova/nova/utils.py", line 664, in context_wrapper
[ 512.745553] env[68571]: return func(*args, **kwargs)
[ 512.745553] env[68571]: File "/opt/stack/nova/nova/context.py", line 422, in gather_result
[ 512.745553] env[68571]: result = fn(*args, **kwargs)
[ 512.745553] env[68571]: File "/opt/stack/nova/nova/db/main/api.py", line 179, in wrapper
[ 512.745553] env[68571]: return f(*args, **kwargs)
[ 512.745553] env[68571]: File "/opt/stack/nova/nova/objects/service.py", line 548, in _db_service_get_minimum_version
[ 512.745553] env[68571]: return db.service_get_minimum_version(context, binaries)
[ 512.745553] env[68571]: File "/opt/stack/nova/nova/db/main/api.py", line 238, in wrapper
[ 512.745553] env[68571]: _check_db_access()
[ 512.745553] env[68571]: File "/opt/stack/nova/nova/db/main/api.py", line 188, in _check_db_access
[ 512.745553] env[68571]: stacktrace = ''.join(traceback.format_stack())
[ 512.745553] env[68571]: 
[ 512.746135] env[68571]: WARNING nova.objects.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] Failed to get minimum service version for cell 00000000-0000-0000-0000-000000000000
[ 512.746135] env[68571]: WARNING nova.objects.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] Failed to get minimum service version for cell 820033e8-12ef-4fe6-9430-254da2e59744
[ 512.746548] env[68571]: DEBUG oslo_concurrency.lockutils [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] Acquiring lock "singleton_lock" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 512.746713] env[68571]: DEBUG oslo_concurrency.lockutils [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] Acquired lock "singleton_lock" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 512.746967] env[68571]: DEBUG oslo_concurrency.lockutils [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] Releasing lock "singleton_lock" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 512.747306] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] Full set of CONF: {{(pid=68571) _wait_for_exit_or_signal /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/service.py:362}}
[ 512.747453] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ******************************************************************************** {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2589}}
[ 512.747579] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] Configuration options gathered from: {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2590}}
[ 512.747713] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] command line args: ['--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-cpu-common.conf', '--config-file', '/etc/nova/nova-cpu-1.conf'] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2591}}
[ 512.747899] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2592}}
[ 512.748037] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ================================================================================ {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2594}}
[ 512.748254] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] allow_resize_to_same_host = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.748420] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] arq_binding_timeout = 300 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.748549] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] backdoor_port = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.748673] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] backdoor_socket = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.748833] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] block_device_allocate_retries = 60 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.748999] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] block_device_allocate_retries_interval = 3 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.749187] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cert = self.pem {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.749355] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] compute_driver = vmwareapi.VMwareVCDriver {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.749522] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] compute_monitors = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.749689] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] config_dir = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.749858] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] config_drive_format = iso9660 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.749990] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] config_file = ['/etc/nova/nova.conf', '/etc/nova/nova-cpu-common.conf', '/etc/nova/nova-cpu-1.conf'] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.750175] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] config_source = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.750344] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] console_host = devstack {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.750509] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] control_exchange = nova {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.750665] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cpu_allocation_ratio = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.750820] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] daemon = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.750984] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] debug = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.751155] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] default_access_ip_network_name = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.751320] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] default_availability_zone = nova {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.751473] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] default_ephemeral_format = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.751629] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] default_green_pool_size = 1000 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.751896] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] default_log_levels = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.752078] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] default_schedule_zone = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.752243] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] disk_allocation_ratio = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.752403] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] enable_new_services = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.752580] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] enabled_apis = ['osapi_compute'] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.752743] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] enabled_ssl_apis = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.752902] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] flat_injected = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.753066] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] force_config_drive = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.753230] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] force_raw_images = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.753397] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] graceful_shutdown_timeout = 5 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.753556] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] heal_instance_info_cache_interval = 60 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.753775] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] host = cpu-1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.753972] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] initial_cpu_allocation_ratio = 4.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.755948] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] initial_disk_allocation_ratio = 1.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.755948] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] initial_ram_allocation_ratio = 1.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.755948] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] injected_network_template = /opt/stack/nova/nova/virt/interfaces.template {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.755948] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] instance_build_timeout = 0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.755948] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] instance_delete_interval = 300 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.755948] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] instance_format = [instance: %(uuid)s] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.756177] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] instance_name_template = instance-%08x {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.756177] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] instance_usage_audit = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.756177] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] instance_usage_audit_period = month {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.756177] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] instance_uuid_format = [instance: %(uuid)s] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.756177] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] instances_path = /opt/stack/data/nova/instances {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.756177] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] internal_service_availability_zone = internal {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.756334] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] key = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.756334] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] live_migration_retry_count = 30 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.756566] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] log_config_append = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.756566] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] log_date_format = %Y-%m-%d %H:%M:%S {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.756733] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] log_dir = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.756891] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] log_file = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.757049] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] log_options = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.757225] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] log_rotate_interval = 1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.757393] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] log_rotate_interval_type = days {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.757557] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] log_rotation_type = none {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.757687] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] logging_context_format_string = %(color)s%(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(project_name)s %(user_name)s%(color)s] %(instance)s%(color)s%(message)s {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.757822] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] logging_debug_format_suffix = {{(pid=%(process)d) %(funcName)s %(pathname)s:%(lineno)d}} {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.757997] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] logging_default_format_string = %(color)s%(levelname)s %(name)s [-%(color)s] %(instance)s%(color)s%(message)s {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.758181] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] logging_exception_prefix = ERROR %(name)s %(instance)s {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.758310] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] logging_user_identity_format = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.758470] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] long_rpc_timeout = 1800 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.758625] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] max_concurrent_builds = 10 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.758780] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] max_concurrent_live_migrations = 1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.758936] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] max_concurrent_snapshots = 5 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.759104] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] max_local_block_devices = 3 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.759267] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] max_logfile_count = 30 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.759421] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] max_logfile_size_mb = 200 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.759575] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] maximum_instance_delete_attempts = 5 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.759741] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] metadata_listen = 0.0.0.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.759911] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] metadata_listen_port = 8775 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.760089] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] metadata_workers = 2 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.760255] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] migrate_max_retries = -1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.760424] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] mkisofs_cmd = genisoimage {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.760661] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] my_block_storage_ip = 10.180.1.21 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.760797] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] my_ip = 10.180.1.21 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.760960] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] network_allocate_retries = 0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.761153] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.761320] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] osapi_compute_listen = 0.0.0.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.761484] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] osapi_compute_listen_port = 8774 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.761649] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] osapi_compute_unique_server_name_scope = {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.761814] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] osapi_compute_workers = 2 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.761973] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] password_length = 12 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.762147] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] periodic_enable = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.762304] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] periodic_fuzzy_delay = 60 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.762472] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] pointer_model = usbtablet {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.762635] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] preallocate_images = none {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.762791] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] publish_errors = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.762943] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] pybasedir = /opt/stack/nova {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.763121] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ram_allocation_ratio = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.763281] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] rate_limit_burst = 0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.763444] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] rate_limit_except_level = CRITICAL {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.763598] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] rate_limit_interval = 0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.763754] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] reboot_timeout = 0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.763910] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] reclaim_instance_interval = 0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.764083] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] record = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.764259] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] reimage_timeout_per_gb = 60 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.764424] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] report_interval = 120 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.764583] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] rescue_timeout = 0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.764745] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] reserved_host_cpus = 0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.764902] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] reserved_host_disk_mb = 0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.765068] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] reserved_host_memory_mb = 512 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.765230] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] reserved_huge_pages = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.765388] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] resize_confirm_window = 0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.765552] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] resize_fs_using_block_device = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.765704] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] resume_guests_state_on_host_boot = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.765871] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] rootwrap_config = /etc/nova/rootwrap.conf {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.766039] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] rpc_response_timeout = 60 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.766202] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] run_external_periodic_tasks = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.766368] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] running_deleted_instance_action = reap {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.766523] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] running_deleted_instance_poll_interval = 1800 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.766677] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] running_deleted_instance_timeout = 0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.766835] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] scheduler_instance_sync_interval = 120 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.767031] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] service_down_time = 720 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.767219] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] servicegroup_driver = db {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.767381] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] shelved_offload_time = 0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.767536] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] shelved_poll_interval = 3600 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.767702] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] shutdown_timeout = 0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.767861] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] source_is_ipv6 = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.768030] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ssl_only = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.768295] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] state_path = /opt/stack/data/n-cpu-1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.768463] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] sync_power_state_interval = 600 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.768620] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] sync_power_state_pool_size = 1000 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.768788] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] syslog_log_facility = LOG_USER {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.768944] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] tempdir = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.769116] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] timeout_nbd = 10 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.769285] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] transport_url = **** {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.769445] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] update_resources_interval = 0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.769601] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] use_cow_images = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.769758] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] use_eventlog = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.769911] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] use_journal = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.770078] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] use_json = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.770238] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] use_rootwrap_daemon = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.770392] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] use_stderr = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.770545] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] use_syslog = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.770702] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vcpu_pin_set = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.770860] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vif_plugging_is_fatal = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.771031] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vif_plugging_timeout = 300 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.771201] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] virt_mkfs = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.771358] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] volume_usage_poll_interval = 0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.771513] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] watch_log_file = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.771679] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] web = /usr/share/spice-html5 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2602}}
[ 512.771867] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_concurrency.disable_process_locking = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.772180] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_concurrency.lock_path = /opt/stack/data/n-cpu-1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.772367] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_metrics.metrics_buffer_size = 1000 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.772533] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_metrics.metrics_enabled = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.772702] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_metrics.metrics_process_name = {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.772871] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.773041] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.773222] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api.auth_strategy = keystone {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.773390] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api.compute_link_prefix = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.773564] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.773737] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api.dhcp_domain = novalocal {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.773906] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api.enable_instance_password = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.774082] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api.glance_link_prefix = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.774254] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api.instance_list_cells_batch_fixed_size = 100 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.774428] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api.instance_list_cells_batch_strategy = distributed {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.774591] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api.instance_list_per_project_cells = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.774752] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api.list_records_by_skipping_down_cells = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.774916] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api.local_metadata_per_cell = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.775098] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api.max_limit = 1000 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.775273] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api.metadata_cache_expiration = 15 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.775512] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api.neutron_default_tenant_id = default {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.775704] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api.use_forwarded_for = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.775877] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api.use_neutron_default_nets = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.776081] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api.vendordata_dynamic_connect_timeout = 5 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.776258] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api.vendordata_dynamic_failure_fatal = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.776428] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api.vendordata_dynamic_read_timeout = 5 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.776615] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api.vendordata_dynamic_ssl_certfile = {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.776793] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api.vendordata_dynamic_targets = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.776960] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api.vendordata_jsonfile_path = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.777159] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api.vendordata_providers = ['StaticJSON'] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.777356] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.backend = dogpile.cache.memcached {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.777526] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.backend_argument = **** {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.777700] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.config_prefix = cache.oslo {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.777874] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.dead_timeout = 60.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.778052] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.debug_cache_backend = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.778223] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.enable_retry_client = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.778387] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.enable_socket_keepalive = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.778558] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.enabled = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.778721] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.expiration_time = 600 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.778883] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.hashclient_retry_attempts = 2 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.779058] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.hashclient_retry_delay = 1.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.779226] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.memcache_dead_retry = 300 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.779394] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.memcache_password = {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.779555] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.memcache_pool_connection_get_timeout = 10 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.779718] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.memcache_pool_flush_on_reconnect = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.779881] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.memcache_pool_maxsize = 10 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.780054] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.memcache_pool_unused_timeout = 60 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.780222] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.memcache_sasl_enabled = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.780401] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.memcache_servers = ['localhost:11211'] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.780568] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.memcache_socket_timeout = 1.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.780734] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.memcache_username = {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.780898] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.proxies = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.781071] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.retry_attempts = 2 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.781243] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.retry_delay = 0.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.781406] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.socket_keepalive_count = 1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.781565] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.socket_keepalive_idle = 1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.781727] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.socket_keepalive_interval = 1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}}
[ 512.781883] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.tls_allowed_ciphers = None {{(pid=68571) log_opt_values
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.782051] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.tls_cafile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.782211] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.tls_certfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.782370] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.tls_enabled = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.782526] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cache.tls_keyfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.782692] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cinder.auth_section = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.782864] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cinder.auth_type = password {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.783036] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cinder.cafile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.783215] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cinder.catalog_info = volumev3::publicURL {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.783375] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cinder.certfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.783537] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cinder.collect_timing = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.783699] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cinder.cross_az_attach = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.783859] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cinder.debug = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.784026] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cinder.endpoint_template = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.784189] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cinder.http_retries = 3 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.784353] env[68571]: DEBUG oslo_service.service [None 
req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cinder.insecure = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.784508] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cinder.keyfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.784681] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cinder.os_region_name = RegionOne {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.784841] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cinder.split_loggers = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.784999] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cinder.timeout = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.785186] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] compute.consecutive_build_service_disable_threshold = 10 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.785349] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] compute.cpu_dedicated_set = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.785508] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] compute.cpu_shared_set = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.785672] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] compute.image_type_exclude_list = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.785832] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] compute.live_migration_wait_for_vif_plug = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.786040] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] compute.max_concurrent_disk_ops = 0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.786179] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] compute.max_disk_devices_to_attach = -1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.786339] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] compute.packing_host_numa_cells_allocation_strategy = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.786512] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] compute.provider_config_location = /etc/nova/provider_config/ {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.786676] env[68571]: DEBUG oslo_service.service 
[None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] compute.resource_provider_association_refresh = 300 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.786877] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] compute.shutdown_retry_interval = 10 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.787041] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] compute.vmdk_allowed_types = ['streamOptimized', 'monolithicSparse'] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.787236] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] conductor.workers = 2 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.787415] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] console.allowed_origins = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.787576] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] console.ssl_ciphers = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.787746] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] console.ssl_minimum_version = default {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.787919] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] consoleauth.token_ttl = 600 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.788105] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cyborg.cafile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.788274] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cyborg.certfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.788438] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cyborg.collect_timing = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.788596] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cyborg.connect_retries = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.788756] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cyborg.connect_retry_delay = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.788912] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cyborg.endpoint_override = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.789084] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] 
cyborg.insecure = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.789248] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cyborg.keyfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.789408] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cyborg.max_version = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.789566] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cyborg.min_version = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.789720] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cyborg.region_name = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.789875] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cyborg.service_name = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.790056] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cyborg.service_type = accelerator {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.790223] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cyborg.split_loggers = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.790380] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cyborg.status_code_retries = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.790538] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cyborg.status_code_retry_delay = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.790692] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cyborg.timeout = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.790869] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cyborg.valid_interfaces = ['internal', 'public'] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.791036] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] cyborg.version = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.791226] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] database.backend = sqlalchemy {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.791404] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] database.connection = **** {{(pid=68571) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.791577] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] database.connection_debug = 0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.791743] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] database.connection_parameters = {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.791901] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] database.connection_recycle_time = 3600 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.792078] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] database.connection_trace = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.792248] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] database.db_inc_retry_interval = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.792410] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] database.db_max_retries = 20 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.792571] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] database.db_max_retry_interval = 10 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.792729] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] database.db_retry_interval = 1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.792893] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] database.max_overflow = 50 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.793063] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] database.max_pool_size = 5 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.793234] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] database.max_retries = 10 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.793442] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] database.mysql_sql_mode = TRADITIONAL {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.793556] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] database.mysql_wsrep_sync_wait = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.793715] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] database.pool_timeout = None {{(pid=68571) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.800588] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] database.retry_interval = 10 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.800588] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] database.slave_connection = **** {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.800588] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] database.sqlite_synchronous = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.800588] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] database.use_db_reconnect = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.800588] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api_database.backend = sqlalchemy {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.800588] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api_database.connection = **** {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.800805] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api_database.connection_debug = 0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.800805] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api_database.connection_parameters = {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.800805] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api_database.connection_recycle_time = 3600 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.800805] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api_database.connection_trace = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.800805] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api_database.db_inc_retry_interval = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.800805] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api_database.db_max_retries = 20 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.800955] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api_database.db_max_retry_interval = 10 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.800955] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api_database.db_retry_interval = 1 {{(pid=68571) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.800955] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api_database.max_overflow = 50 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.800955] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api_database.max_pool_size = 5 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.800955] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api_database.max_retries = 10 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.800955] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api_database.mysql_sql_mode = TRADITIONAL {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801126] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api_database.mysql_wsrep_sync_wait = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801126] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api_database.pool_timeout = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801126] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api_database.retry_interval = 10 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801126] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api_database.slave_connection = **** {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801126] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] api_database.sqlite_synchronous = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801126] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] devices.enabled_mdev_types = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801280] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ephemeral_storage_encryption.cipher = aes-xts-plain64 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801280] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ephemeral_storage_encryption.enabled = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801280] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ephemeral_storage_encryption.key_size = 512 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801280] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.api_servers = None {{(pid=68571) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801280] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.cafile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801280] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.certfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801430] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.collect_timing = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801430] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.connect_retries = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801430] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.connect_retry_delay = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801430] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.debug = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801430] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.default_trusted_certificate_ids = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801430] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.enable_certificate_validation = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801581] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.enable_rbd_download = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801581] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.endpoint_override = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801581] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.insecure = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801581] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.keyfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801581] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.max_version = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801581] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.min_version = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801734] env[68571]: DEBUG 
oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.num_retries = 3 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801734] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.rbd_ceph_conf = {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801734] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.rbd_connect_timeout = 5 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801734] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.rbd_pool = {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801734] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.rbd_user = {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801734] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.region_name = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801880] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.service_name = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801880] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.service_type = image {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.801928] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.split_loggers = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.802236] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.status_code_retries = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.802236] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.status_code_retry_delay = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.802372] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.timeout = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.802551] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.valid_interfaces = ['internal', 'public'] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.802715] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.verify_glance_signatures = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.802873] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] glance.version = None {{(pid=68571) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.803056] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] guestfs.debug = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.803232] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] hyperv.config_drive_cdrom = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.803395] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] hyperv.config_drive_inject_password = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.803563] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] hyperv.dynamic_memory_ratio = 1.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.803719] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] hyperv.enable_instance_metrics_collection = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.803878] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] hyperv.enable_remotefx = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.804085] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] hyperv.instances_path_share = {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.804266] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] hyperv.iscsi_initiator_list = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.804428] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] hyperv.limit_cpu_features = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.804589] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] hyperv.mounted_disk_query_retry_count = 10 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.804751] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] hyperv.mounted_disk_query_retry_interval = 5 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.804909] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] hyperv.power_state_check_timeframe = 60 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.805087] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] hyperv.power_state_event_polling_interval = 2 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.805261] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] hyperv.qemu_img_cmd = qemu-img.exe {{(pid=68571) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.805422] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] hyperv.use_multipath_io = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.805581] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] hyperv.volume_attach_retry_count = 10 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.805739] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] hyperv.volume_attach_retry_interval = 5 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.805894] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] hyperv.vswitch_name = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.806089] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] hyperv.wait_soft_reboot_seconds = 60 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.806267] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] mks.enabled = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.806627] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] mks.mksproxy_base_url = http://127.0.0.1:6090/ {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.806820] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] image_cache.manager_interval = 2400 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.807026] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] image_cache.precache_concurrency = 1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.807213] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] image_cache.remove_unused_base_images = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.807385] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] image_cache.remove_unused_original_minimum_age_seconds = 86400 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.807552] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] image_cache.remove_unused_resized_minimum_age_seconds = 3600 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.807728] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] image_cache.subdirectory_name = _base {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.807904] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ironic.api_max_retries 
= 60 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.808079] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ironic.api_retry_interval = 2 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.808251] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ironic.auth_section = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.808413] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ironic.auth_type = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.808570] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ironic.cafile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.808727] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ironic.certfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.808887] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ironic.collect_timing = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.809057] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ironic.conductor_group = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.809219] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ironic.connect_retries = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.809379] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ironic.connect_retry_delay = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.809536] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ironic.endpoint_override = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.809694] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ironic.insecure = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.809849] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ironic.keyfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.810015] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ironic.max_version = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.810182] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ironic.min_version = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.810348] env[68571]: DEBUG 
oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ironic.peer_list = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.810506] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ironic.region_name = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.810667] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ironic.serial_console_state_timeout = 10 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.810822] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ironic.service_name = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.810990] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ironic.service_type = baremetal {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.811167] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ironic.split_loggers = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.811324] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ironic.status_code_retries = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.811485] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ironic.status_code_retry_delay = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.811639] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ironic.timeout = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.811816] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ironic.valid_interfaces = ['internal', 'public'] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.811976] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ironic.version = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.812176] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] key_manager.backend = nova.keymgr.conf_key_mgr.ConfKeyManager {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.812352] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] key_manager.fixed_key = **** {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.812536] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] barbican.auth_endpoint = http://localhost/identity/v3 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.812699] env[68571]: DEBUG oslo_service.service [None 
req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] barbican.barbican_api_version = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.812858] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] barbican.barbican_endpoint = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.813039] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] barbican.barbican_endpoint_type = public {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.813205] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] barbican.barbican_region_name = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.813362] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] barbican.cafile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.813520] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] barbican.certfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.813684] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] barbican.collect_timing = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.813836] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] barbican.insecure = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.814025] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] barbican.keyfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.814212] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] barbican.number_of_retries = 60 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.814375] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] barbican.retry_delay = 1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.814539] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] barbican.send_service_user_token = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.814700] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] barbican.split_loggers = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.814857] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] barbican.timeout = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.815028] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] barbican.verify_ssl = True {{(pid=68571) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.815194] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] barbican.verify_ssl_path = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.815359] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] barbican_service_user.auth_section = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.815520] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] barbican_service_user.auth_type = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.815676] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] barbican_service_user.cafile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.815835] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] barbican_service_user.certfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.816018] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] barbican_service_user.collect_timing = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.816195] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] barbican_service_user.insecure = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.816356] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] barbican_service_user.keyfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.816516] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] barbican_service_user.split_loggers = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.816670] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] barbican_service_user.timeout = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.816836] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vault.approle_role_id = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.817015] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vault.approle_secret_id = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.817232] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vault.cafile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.817348] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vault.certfile = None {{(pid=68571) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.817516] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vault.collect_timing = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.817679] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vault.insecure = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.817841] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vault.keyfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.818019] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vault.kv_mountpoint = secret {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.818187] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vault.kv_path = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.818352] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vault.kv_version = 2 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.818512] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vault.namespace = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.818668] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vault.root_token_id = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.818829] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vault.split_loggers = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.818988] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vault.ssl_ca_crt_file = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.819160] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vault.timeout = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.819322] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vault.use_ssl = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.819492] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vault.vault_url = http://127.0.0.1:8200 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.819660] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] keystone.auth_section = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.819823] env[68571]: DEBUG oslo_service.service [None 
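Every "section.option = value" record in this dump, from the [barbican] and [vault] groups above onward, is emitted by the same helper named in each record's trailer: oslo.config's ConfigOpts.log_opt_values() (cfg.py:2609), which oslo_service runs at startup. A minimal sketch of that mechanism, using illustrative registrations that mirror the [vault] defaults logged above (Nova/Castellan register the real options):

    import logging
    from oslo_config import cfg

    CONF = cfg.CONF
    LOG = logging.getLogger(__name__)

    # Illustrative options only, mirroring the [vault] records above.
    CONF.register_opts(
        [cfg.StrOpt('vault_url', default='http://127.0.0.1:8200'),
         cfg.IntOpt('kv_version', default=2)],
        group='vault')

    # Dumps every registered option at the given level, one record per
    # option -- exactly the stream of DEBUG lines seen in this log.
    CONF.log_opt_values(LOG, logging.DEBUG)
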
req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] keystone.auth_type = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.819982] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] keystone.cafile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.820157] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] keystone.certfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.820320] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] keystone.collect_timing = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.820477] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] keystone.connect_retries = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.820635] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] keystone.connect_retry_delay = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.820792] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] keystone.endpoint_override = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.820951] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] keystone.insecure = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.821120] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] keystone.keyfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.821277] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] keystone.max_version = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.821435] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] keystone.min_version = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.821590] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] keystone.region_name = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.821744] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] keystone.service_name = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.821912] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] keystone.service_type = identity {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.822082] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] keystone.split_loggers = False {{(pid=68571) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.822242] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] keystone.status_code_retries = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.822400] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] keystone.status_code_retry_delay = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.822554] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] keystone.timeout = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.822733] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] keystone.valid_interfaces = ['internal', 'public'] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.822894] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] keystone.version = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.823116] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.connection_uri = {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.823283] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.cpu_mode = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.823449] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.cpu_model_extra_flags = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.823619] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.cpu_models = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.823803] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.cpu_power_governor_high = performance {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.823959] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.cpu_power_governor_low = powersave {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.824137] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.cpu_power_management = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.824309] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.cpu_power_management_strategy = cpu_state {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.824473] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.device_detach_attempts = 8 {{(pid=68571) log_opt_values 
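The [keystone] group just listed (service_type = identity, valid_interfaces = ['internal', 'public'], the rest unset) is a standard keystoneauth1 adapter option set. A hedged sketch of how a service typically turns such a group into a client adapter, assuming the stock keystoneauth1 loading helpers rather than Nova's exact wiring, and assuming the options were registered beforehand:

    from keystoneauth1 import loading as ks_loading
    from oslo_config import cfg

    CONF = cfg.CONF

    # Build auth, session and adapter from a '[keystone]'-style group;
    # the adapter honors service_type and valid_interfaces as logged above.
    auth = ks_loading.load_auth_from_conf_options(CONF, 'keystone')
    sess = ks_loading.load_session_from_conf_options(CONF, 'keystone', auth=auth)
    adapter = ks_loading.load_adapter_from_conf_options(
        CONF, 'keystone', session=sess)
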
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.824635] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.device_detach_timeout = 20 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.824801] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.disk_cachemodes = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.824959] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.disk_prefix = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.825140] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.enabled_perf_events = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.825308] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.file_backed_memory = 0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.825471] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.gid_maps = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.825629] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.hw_disk_discard = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.825787] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.hw_machine_type = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.825994] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.images_rbd_ceph_conf = {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.826175] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.images_rbd_glance_copy_poll_interval = 15 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.826347] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.images_rbd_glance_copy_timeout = 600 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.826519] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.images_rbd_glance_store_name = {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.826686] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.images_rbd_pool = rbd {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.826856] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.images_type = default {{(pid=68571) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.827053] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.images_volume_group = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.827231] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.inject_key = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.827397] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.inject_partition = -2 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.827558] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.inject_password = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.827722] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.iscsi_iface = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.827922] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.iser_use_multipath = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.828064] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.live_migration_bandwidth = 0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.828235] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.live_migration_completion_timeout = 800 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.828400] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.live_migration_downtime = 500 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.828563] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.live_migration_downtime_delay = 75 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.828728] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.live_migration_downtime_steps = 10 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.828891] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.live_migration_inbound_addr = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.829066] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.live_migration_permit_auto_converge = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.829235] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.live_migration_permit_post_copy = False {{(pid=68571) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.829392] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.live_migration_scheme = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.829563] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.live_migration_timeout_action = abort {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.829726] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.live_migration_tunnelled = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.829884] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.live_migration_uri = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.830057] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.live_migration_with_native_tls = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.830222] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.max_queues = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.830385] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.mem_stats_period_seconds = 10 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.830545] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.nfs_mount_options = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.830876] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.nfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.831078] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.num_aoe_discover_tries = 3 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.831259] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.num_iser_scan_tries = 5 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.831428] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.num_memory_encrypted_guests = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.831590] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.num_nvme_discover_tries = 5 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.831755] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.num_pcie_ports = 0 
{{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.831923] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.num_volume_scan_tries = 5 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.832102] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.pmem_namespaces = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.832268] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.quobyte_client_cfg = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.832570] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.quobyte_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.832743] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.rbd_connect_timeout = 5 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.832911] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.rbd_destroy_volume_retries = 12 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.833088] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.rbd_destroy_volume_retry_interval = 5 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.833252] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.rbd_secret_uuid = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.833412] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.rbd_user = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.833577] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.realtime_scheduler_priority = 1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.833749] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.remote_filesystem_transport = ssh {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.833937] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.rescue_image_id = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.834079] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.rescue_kernel_id = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.834241] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.rescue_ramdisk_id = None {{(pid=68571) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.834409] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.rng_dev_path = /dev/urandom {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.834572] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.rx_queue_size = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.834742] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.smbfs_mount_options = {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.835034] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.smbfs_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.835216] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.snapshot_compression = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.835380] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.snapshot_image_format = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.835603] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.snapshots_directory = /opt/stack/data/nova/instances/snapshots {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.835771] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.sparse_logical_volumes = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.835950] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.swtpm_enabled = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.836146] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.swtpm_group = tss {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.836324] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.swtpm_user = tss {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.836495] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.sysinfo_serial = unique {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.836654] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.tb_cache_size = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.836811] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.tx_queue_size = None {{(pid=68571) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.836995] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.uid_maps = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.837209] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.use_virtio_for_bridges = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.837389] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.virt_type = kvm {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.837565] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.volume_clear = zero {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.837731] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.volume_clear_size = 0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.837898] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.volume_use_multipath = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.838071] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.vzstorage_cache_path = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.838247] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.vzstorage_log_path = /var/log/vstorage/%(cluster_name)s/nova.log.gz {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.838416] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.vzstorage_mount_group = qemu {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.838583] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.vzstorage_mount_opts = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.838753] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.vzstorage_mount_perms = 0770 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.839048] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.vzstorage_mount_point_base = /opt/stack/data/n-cpu-1/mnt {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.839235] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.vzstorage_mount_user = stack {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.839406] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] libvirt.wait_soft_reboot_seconds = 120 {{(pid=68571) 
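The long [libvirt] block above is all defaults (virt_type = kvm, mem_stats_period_seconds = 10, wait_soft_reboot_seconds = 120, and so on). For reference, a small sketch of overriding such options programmatically with oslo.config, as tests commonly do; the option names are the ones logged above, the override values are invented:

    from oslo_config import cfg

    CONF = cfg.CONF

    # set_override replaces a registered option's value for the given group;
    # the value is validated against the option's declared type.
    CONF.set_override('virt_type', 'qemu', group='libvirt')
    CONF.set_override('wait_soft_reboot_seconds', 60, group='libvirt')
    assert CONF.libvirt.virt_type == 'qemu'
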
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.839582] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.auth_section = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.839758] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.auth_type = password {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.839922] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.cafile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.840096] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.certfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.840267] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.collect_timing = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.840425] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.connect_retries = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.840587] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.connect_retry_delay = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.840761] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.default_floating_pool = public {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.840923] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.endpoint_override = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.841109] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.extension_sync_interval = 600 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.841277] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.http_retries = 3 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.841485] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.insecure = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.841661] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.keyfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.841825] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.max_version = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.842007] env[68571]: 
DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.metadata_proxy_shared_secret = **** {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.842191] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.min_version = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.842365] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.ovs_bridge = br-int {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.842532] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.physnets = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.842705] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.region_name = RegionOne {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.842876] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.service_metadata_proxy = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.843048] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.service_name = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.843226] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.service_type = network {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.843393] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.split_loggers = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.843554] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.status_code_retries = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.843715] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.status_code_retry_delay = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.843876] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.timeout = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.844127] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.valid_interfaces = ['internal', 'public'] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.844315] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] neutron.version = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.844511] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None 
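Note that neutron.metadata_proxy_shared_secret is rendered as **** while every other value prints verbatim: log_opt_values masks any option declared with secret=True. A one-line sketch of the declaration that produces that masking (the real definition lives in Nova's conf modules):

    from oslo_config import cfg

    # secret=True is what makes log_opt_values print '****' instead of the value.
    opt = cfg.StrOpt('metadata_proxy_shared_secret', secret=True, default='')
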
None] notifications.bdms_in_notifications = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.844707] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] notifications.default_level = INFO {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.844888] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] notifications.notification_format = unversioned {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.845068] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] notifications.notify_on_state_change = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.845255] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] notifications.versioned_notifications_topics = ['versioned_notifications'] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.845437] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] pci.alias = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.845610] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] pci.device_spec = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.845776] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] pci.report_in_placement = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.845976] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.auth_section = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.846171] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.auth_type = password {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.846348] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.auth_url = http://10.180.1.21/identity {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.846512] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.cafile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.846673] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.certfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.846837] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.collect_timing = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.847034] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] 
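The [notifications] section above shows notification_format = unversioned with versioned_notifications_topics at its default. Under oslo.messaging this corresponds roughly to the Notifier construction sketched below; this assumes standard oslo.messaging usage, not Nova's exact code, and the publisher_id is hypothetical:

    import oslo_messaging
    from oslo_config import cfg

    CONF = cfg.CONF

    transport = oslo_messaging.get_notification_transport(CONF)
    notifier = oslo_messaging.Notifier(
        transport,
        publisher_id='compute.host1',        # hypothetical publisher id
        driver='messagingv2',
        topics=['versioned_notifications'])  # default topic as logged above
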
placement.connect_retries = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.847231] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.connect_retry_delay = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.847398] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.default_domain_id = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.847575] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.default_domain_name = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.847719] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.domain_id = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.847876] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.domain_name = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.848045] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.endpoint_override = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.848216] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.insecure = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.848374] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.keyfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.848530] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.max_version = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.848686] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.min_version = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.848854] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.password = **** {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.849026] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.project_domain_id = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.849195] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.project_domain_name = Default {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.849364] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.project_id = None {{(pid=68571) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.849538] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.project_name = service {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.849706] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.region_name = RegionOne {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.849867] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.service_name = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.850049] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.service_type = placement {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.850223] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.split_loggers = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.850380] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.status_code_retries = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.850540] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.status_code_retry_delay = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.850700] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.system_scope = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.850857] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.timeout = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.851026] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.trust_id = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.851229] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.user_domain_id = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.851414] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.user_domain_name = Default {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.851575] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.user_id = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.851751] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.username = placement {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
512.851935] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.valid_interfaces = ['internal', 'public'] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.852109] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] placement.version = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.852328] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] quota.cores = 20 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.852458] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] quota.count_usage_from_placement = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.852630] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] quota.driver = nova.quota.DbQuotaDriver {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.852800] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] quota.injected_file_content_bytes = 10240 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.852967] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] quota.injected_file_path_length = 255 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.853150] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] quota.injected_files = 5 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.853321] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] quota.instances = 10 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.853531] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] quota.key_pairs = 100 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.853710] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] quota.metadata_items = 128 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.853883] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] quota.ram = 51200 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.854060] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] quota.recheck_quota = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.854238] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] quota.server_group_members = 10 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.854404] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None 
None] quota.server_groups = 10 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.854573] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] rdp.enabled = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.854884] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] rdp.html5_proxy_base_url = http://127.0.0.1:6083/ {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.855080] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] scheduler.discover_hosts_in_cells_interval = -1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.855255] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] scheduler.enable_isolated_aggregate_filtering = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.855421] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] scheduler.image_metadata_prefilter = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.855586] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] scheduler.limit_tenants_to_placement_aggregate = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.855751] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] scheduler.max_attempts = 3 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.855915] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] scheduler.max_placement_results = 1000 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.856122] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] scheduler.placement_aggregate_required_for_tenants = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.856297] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] scheduler.query_placement_for_image_type_support = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.856462] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] scheduler.query_placement_for_routed_network_aggregates = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.856641] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] scheduler.workers = 2 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.856816] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] filter_scheduler.aggregate_image_properties_isolation_namespace = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
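The [quota] values above (instances = 10, cores = 20, ram = 51200) are enforced jointly: a boot request must fit under every limit at once. A toy illustration of that intersection check, not the DbQuotaDriver logic itself (whose class name is the one logged above):

    def fits_quota(n, vcpus, ram_mb, used=(0, 0, 0),
                   limits=(10, 20, 51200)):  # instances, cores, ram as logged
        u_inst, u_cores, u_ram = used
        l_inst, l_cores, l_ram = limits
        return (u_inst + n <= l_inst
                and u_cores + n * vcpus <= l_cores
                and u_ram + n * ram_mb <= l_ram)

    # 5 x (4 vCPU, 8192 MB) instances fit exactly (20 cores, 40960 MB);
    # a sixth instance would break the cores limit first.
    assert fits_quota(5, 4, 8192)
    assert not fits_quota(6, 4, 8192)
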
512.857028] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] filter_scheduler.aggregate_image_properties_isolation_separator = . {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.857247] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.857433] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] filter_scheduler.build_failure_weight_multiplier = 1000000.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.857604] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] filter_scheduler.cpu_weight_multiplier = 1.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.857773] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.857939] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] filter_scheduler.disk_weight_multiplier = 1.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.858151] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter', 'SameHostFilter', 'DifferentHostFilter'] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.858326] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] filter_scheduler.host_subset_size = 1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.858496] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.858659] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] filter_scheduler.image_properties_default_architecture = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.858826] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] filter_scheduler.io_ops_weight_multiplier = -1.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.858996] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] filter_scheduler.isolated_hosts = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.859180] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] filter_scheduler.isolated_images = [] 
{{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.859347] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] filter_scheduler.max_instances_per_host = 50 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.859509] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] filter_scheduler.max_io_ops_per_host = 8 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.859676] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] filter_scheduler.num_instances_weight_multiplier = 0.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.859839] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] filter_scheduler.pci_in_placement = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.860009] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] filter_scheduler.pci_weight_multiplier = 1.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.860204] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] filter_scheduler.ram_weight_multiplier = 1.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.860386] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.860554] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] filter_scheduler.shuffle_best_same_weighed_hosts = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.860721] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] filter_scheduler.soft_affinity_weight_multiplier = 1.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.860885] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.861061] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] filter_scheduler.track_instance_changes = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.861317] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.861509] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] metrics.required = True {{(pid=68571) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.861684] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] metrics.weight_multiplier = 1.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.861851] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] metrics.weight_of_unavailable = -10000.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.862030] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] metrics.weight_setting = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.862342] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] serial_console.base_url = ws://127.0.0.1:6083/ {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.862522] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] serial_console.enabled = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.862705] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] serial_console.port_range = 10000:20000 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.862880] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] serial_console.proxyclient_address = 127.0.0.1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.863111] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] serial_console.serialproxy_host = 0.0.0.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.863323] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] serial_console.serialproxy_port = 6083 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.863554] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] service_user.auth_section = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.863773] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] service_user.auth_type = password {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.864011] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] service_user.cafile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.864316] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] service_user.certfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.864503] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] service_user.collect_timing = False {{(pid=68571) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.864705] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] service_user.insecure = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.864920] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] service_user.keyfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.865188] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] service_user.send_service_user_token = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.865433] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] service_user.split_loggers = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.865642] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] service_user.timeout = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.865829] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] spice.agent_enabled = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.866040] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] spice.enabled = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.866351] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] spice.html5proxy_base_url = http://127.0.0.1:6082/spice_auto.html {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.866550] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] spice.html5proxy_host = 0.0.0.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.866723] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] spice.html5proxy_port = 6082 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.866886] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] spice.image_compression = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.867061] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] spice.jpeg_compression = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.867228] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] spice.playback_compression = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.867401] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] spice.server_listen = 127.0.0.1 {{(pid=68571) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.867571] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] spice.server_proxyclient_address = 127.0.0.1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.867761] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] spice.streaming_mode = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.867890] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] spice.zlib_compression = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.868068] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] upgrade_levels.baseapi = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.868277] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] upgrade_levels.cert = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.868413] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] upgrade_levels.compute = auto {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.868572] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] upgrade_levels.conductor = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.868729] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] upgrade_levels.scheduler = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.868894] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vendordata_dynamic_auth.auth_section = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.869070] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vendordata_dynamic_auth.auth_type = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.869237] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vendordata_dynamic_auth.cafile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.869397] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vendordata_dynamic_auth.certfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.869558] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vendordata_dynamic_auth.collect_timing = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.869719] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vendordata_dynamic_auth.insecure = False {{(pid=68571) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.869876] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vendordata_dynamic_auth.keyfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.870050] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vendordata_dynamic_auth.split_loggers = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.870223] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vendordata_dynamic_auth.timeout = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.870396] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vmware.api_retry_count = 10 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.870559] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vmware.ca_file = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.870733] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vmware.cache_prefix = devstack-image-cache {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.870901] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vmware.cluster_name = testcl1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.871074] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vmware.connection_pool_size = 10 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.871240] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vmware.console_delay_seconds = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.871407] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vmware.datastore_regex = ^datastore.* {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.871612] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vmware.host_ip = vc1.osci.c.eu-de-1.cloud.sap {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.871787] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vmware.host_password = **** {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.871954] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vmware.host_port = 443 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.872138] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vmware.host_username = administrator@vsphere.local {{(pid=68571) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.872309] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vmware.insecure = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.872471] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vmware.integration_bridge = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.872634] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vmware.maximum_objects = 100 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.872793] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vmware.pbm_default_policy = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.872956] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vmware.pbm_enabled = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.873130] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vmware.pbm_wsdl_location = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.873302] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vmware.serial_log_dir = /opt/vmware/vspc {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.873461] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vmware.serial_port_proxy_uri = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.873619] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vmware.serial_port_service_uri = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.873789] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vmware.task_poll_interval = 0.5 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.873984] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vmware.use_linked_clone = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.874180] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vmware.vnc_keymap = en-us {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.874350] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vmware.vnc_port = 5900 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.874517] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vmware.vnc_port_total = 10000 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.874701] 
env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vnc.auth_schemes = ['none'] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.874877] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vnc.enabled = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.875190] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vnc.novncproxy_base_url = http://127.0.0.1:6080/vnc_auto.html {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.875383] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vnc.novncproxy_host = 0.0.0.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.875558] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vnc.novncproxy_port = 6080 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.875737] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vnc.server_listen = 127.0.0.1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.875911] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vnc.server_proxyclient_address = 127.0.0.1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.876115] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vnc.vencrypt_ca_certs = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.876286] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vnc.vencrypt_client_cert = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.876446] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vnc.vencrypt_client_key = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.876619] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] workarounds.disable_compute_service_check_for_ffu = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.876785] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] workarounds.disable_deep_image_inspection = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.876960] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] workarounds.disable_fallback_pcpu_query = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.877191] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] workarounds.disable_group_policy_check_upcall = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 
512.877375] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] workarounds.disable_libvirt_livesnapshot = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.877544] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] workarounds.disable_rootwrap = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.877708] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] workarounds.enable_numa_live_migration = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.877873] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] workarounds.enable_qemu_monitor_announce_self = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.878046] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.878219] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] workarounds.handle_virt_lifecycle_events = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.878382] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] workarounds.libvirt_disable_apic = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.878545] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] workarounds.never_download_image_if_on_rbd = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.878706] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] workarounds.qemu_monitor_announce_self_count = 3 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.878868] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] workarounds.qemu_monitor_announce_self_interval = 1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.879039] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] workarounds.reserve_disk_resource_for_image_cache = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.879208] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] workarounds.skip_cpu_compare_at_startup = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.879370] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] workarounds.skip_cpu_compare_on_dest = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.879529] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None 
None] workarounds.skip_hypervisor_version_check_on_lm = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.879691] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] workarounds.skip_reserve_in_use_ironic_nodes = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.879852] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] workarounds.unified_limits_count_pcpu_as_vcpu = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.880025] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.880219] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] wsgi.api_paste_config = /etc/nova/api-paste.ini {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.880390] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] wsgi.client_socket_timeout = 900 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.880558] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] wsgi.default_pool_size = 1000 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.880725] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] wsgi.keep_alive = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.880890] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] wsgi.max_header_line = 16384 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.881065] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] wsgi.secure_proxy_ssl_header = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.881290] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] wsgi.ssl_ca_file = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.881469] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] wsgi.ssl_cert_file = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.881632] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] wsgi.ssl_key_file = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.881801] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] wsgi.tcp_keepidle = 600 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.881980] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] 
wsgi.wsgi_log_format = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.882168] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] zvm.ca_file = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.882333] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] zvm.cloud_connector_url = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.882620] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] zvm.image_tmp_path = /opt/stack/data/n-cpu-1/images {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.882796] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] zvm.reachable_timeout = 300 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.882978] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_policy.enforce_new_defaults = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.883165] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_policy.enforce_scope = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.883345] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_policy.policy_default_rule = default {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.883527] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_policy.policy_dirs = ['policy.d'] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.883704] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_policy.policy_file = policy.yaml {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.883877] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_policy.remote_content_type = application/x-www-form-urlencoded {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.884076] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_policy.remote_ssl_ca_crt_file = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.884252] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_policy.remote_ssl_client_crt_file = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.884412] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_policy.remote_ssl_client_key_file = None {{(pid=68571) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.884585] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_policy.remote_ssl_verify_server_crt = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.884742] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_versionedobjects.fatal_exception_format_errors = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.884918] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.885110] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] profiler.connection_string = messaging:// {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.885284] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] profiler.enabled = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.885456] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] profiler.es_doc_type = notification {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.885618] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] profiler.es_scroll_size = 10000 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.885786] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] profiler.es_scroll_time = 2m {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.885959] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] profiler.filter_error_trace = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.886162] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] profiler.hmac_keys = **** {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.886340] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] profiler.sentinel_service_name = mymaster {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.886509] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] profiler.socket_timeout = 0.1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.886673] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] profiler.trace_requests = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.886832] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] profiler.trace_sqlalchemy = False {{(pid=68571) log_opt_values 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.887040] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] profiler_jaeger.process_tags = {} {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.887234] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] profiler_jaeger.service_name_prefix = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.887406] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] profiler_otlp.service_name_prefix = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.887574] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] remote_debug.host = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.887732] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] remote_debug.port = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.887960] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.amqp_auto_delete = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.888090] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.amqp_durable_queues = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.888259] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.conn_pool_min_size = 2 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.888423] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.conn_pool_ttl = 1200 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.888587] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.direct_mandatory_flag = True {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.888749] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.enable_cancel_on_failover = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.888911] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.heartbeat_in_pthread = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.889088] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.heartbeat_rate = 2 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.889296] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] 
oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.889464] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.kombu_compression = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.889638] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.kombu_failover_strategy = round-robin {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.889809] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.889981] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.890165] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.rabbit_ha_queues = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.890333] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.rabbit_interval_max = 30 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.890519] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.890687] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.890851] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.891028] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.891203] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.891369] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.rabbit_quorum_queue = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.891537] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.rabbit_retry_backoff = 2 
{{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.891700] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.rabbit_retry_interval = 1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.891864] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.892041] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.rpc_conn_pool_size = 30 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.892216] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.ssl = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.892392] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.ssl_ca_file = {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.892565] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.ssl_cert_file = {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.892731] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.ssl_enforce_fips_mode = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.892905] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.ssl_key_file = {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.893089] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_rabbit.ssl_version = {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.893282] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_notifications.driver = ['messagingv2'] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.893451] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_notifications.retry = -1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.893638] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_notifications.topics = ['notifications'] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.893815] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_messaging_notifications.transport_url = **** {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.894019] env[68571]: DEBUG oslo_service.service 
[None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_limit.auth_section = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.894207] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_limit.auth_type = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.894374] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_limit.cafile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.894546] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_limit.certfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.894715] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_limit.collect_timing = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.894876] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_limit.connect_retries = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.895050] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_limit.connect_retry_delay = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.895218] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_limit.endpoint_id = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.895378] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_limit.endpoint_override = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.895540] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_limit.insecure = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.895698] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_limit.keyfile = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.895855] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_limit.max_version = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.896055] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_limit.min_version = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.896236] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_limit.region_name = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.896400] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_limit.service_name = None {{(pid=68571) 
log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.896559] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_limit.service_type = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.896720] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_limit.split_loggers = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.896879] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_limit.status_code_retries = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.897108] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_limit.status_code_retry_delay = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.897293] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_limit.timeout = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.897455] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_limit.valid_interfaces = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.897615] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_limit.version = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.897781] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_reports.file_event_handler = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.897946] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_reports.file_event_handler_interval = 1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.898124] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] oslo_reports.log_dir = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.898299] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vif_plug_linux_bridge_privileged.capabilities = [12] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.898461] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vif_plug_linux_bridge_privileged.group = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.898621] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vif_plug_linux_bridge_privileged.helper_command = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.898787] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vif_plug_linux_bridge_privileged.logger_name = 
oslo_privsep.daemon {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.898951] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vif_plug_linux_bridge_privileged.thread_pool_size = 8 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.899125] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vif_plug_linux_bridge_privileged.user = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.899297] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vif_plug_ovs_privileged.capabilities = [12, 1] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.899458] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vif_plug_ovs_privileged.group = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.899618] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vif_plug_ovs_privileged.helper_command = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.899783] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.899944] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vif_plug_ovs_privileged.thread_pool_size = 8 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.900116] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] vif_plug_ovs_privileged.user = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.900287] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] os_vif_linux_bridge.flat_interface = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.900466] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] os_vif_linux_bridge.forward_bridge_interface = ['all'] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.900640] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] os_vif_linux_bridge.iptables_bottom_regex = {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.900811] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] os_vif_linux_bridge.iptables_drop_action = DROP {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.900982] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] os_vif_linux_bridge.iptables_top_regex = {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.901184] 
env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] os_vif_linux_bridge.network_device_mtu = 1500 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.901381] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] os_vif_linux_bridge.use_ipv6 = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.901550] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] os_vif_linux_bridge.vlan_interface = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.901731] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] os_vif_ovs.default_qos_type = linux-noop {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.901904] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] os_vif_ovs.isolate_vif = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.902083] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] os_vif_ovs.network_device_mtu = 1500 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.902256] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] os_vif_ovs.ovs_vsctl_timeout = 120 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.902429] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] os_vif_ovs.ovsdb_connection = tcp:127.0.0.1:6640 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.902599] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] os_vif_ovs.ovsdb_interface = native {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.902761] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] os_vif_ovs.per_port_bridge = False {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.902931] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] os_brick.lock_path = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.903121] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] os_brick.wait_mpath_device_attempts = 4 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.903297] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] os_brick.wait_mpath_device_interval = 1 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.903470] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] privsep_osbrick.capabilities = [21] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.903633] 
env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] privsep_osbrick.group = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.903791] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] privsep_osbrick.helper_command = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.903970] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] privsep_osbrick.logger_name = os_brick.privileged {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.904164] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] privsep_osbrick.thread_pool_size = 8 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.904327] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] privsep_osbrick.user = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.904503] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] nova_sys_admin.capabilities = [0, 1, 2, 3, 12, 21] {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.904663] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] nova_sys_admin.group = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.904823] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] nova_sys_admin.helper_command = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.904991] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] nova_sys_admin.logger_name = oslo_privsep.daemon {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.905172] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] nova_sys_admin.thread_pool_size = 8 {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.905331] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] nova_sys_admin.user = None {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2609}} [ 512.905462] env[68571]: DEBUG oslo_service.service [None req-0aaa08a6-b508-4584-8b0b-cc9fbbf5805a None None] ******************************************************************************** {{(pid=68571) log_opt_values /opt/stack/data/venv/lib/python3.10/site-packages/oslo_config/cfg.py:2613}} [ 512.905878] env[68571]: INFO nova.service [-] Starting compute node (version 0.0.1) [ 512.915839] env[68571]: WARNING nova.virt.vmwareapi.driver [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] The vmwareapi driver is not tested by the OpenStack project nor does it have clear maintainer(s) and thus its quality can not be ensured. It should be considered experimental and may be removed in a future release. 
If you are using the driver in production please let us know via the openstack-discuss mailing list. [ 512.916314] env[68571]: INFO nova.virt.node [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] Generated node identity 00d803b3-09f1-4a26-8896-bee0c6f9c5dd [ 512.916537] env[68571]: INFO nova.virt.node [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] Wrote node identity 00d803b3-09f1-4a26-8896-bee0c6f9c5dd to /opt/stack/data/n-cpu-1/compute_id [ 512.930986] env[68571]: WARNING nova.compute.manager [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] Compute nodes ['00d803b3-09f1-4a26-8896-bee0c6f9c5dd'] for host cpu-1 were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning. [ 512.975301] env[68571]: INFO nova.compute.manager [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host [ 512.997693] env[68571]: WARNING nova.compute.manager [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] No compute node record found for host cpu-1. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host cpu-1 could not be found. [ 512.998032] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 512.998326] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 512.998541] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 512.998763] env[68571]: DEBUG nova.compute.resource_tracker [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 513.000406] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8e0625b-2ccc-4bd5-8232-e2ecb9d37436 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 513.011770] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05eaf940-ff30-4220-b0c0-c230aa153870 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 513.029684] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8c7b6f5-10bc-48cf-a3e3-0a1dc8dab3c8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 513.036017] env[68571]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-deeea30b-0b9e-406b-bb84-b3034396b5aa {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 513.064780] env[68571]: DEBUG nova.compute.resource_tracker [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180944MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 513.064884] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 513.065078] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 513.077218] env[68571]: WARNING nova.compute.resource_tracker [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] No compute node record for cpu-1:00d803b3-09f1-4a26-8896-bee0c6f9c5dd: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 00d803b3-09f1-4a26-8896-bee0c6f9c5dd could not be found. [ 513.091239] env[68571]: INFO nova.compute.resource_tracker [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] Compute node record created for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 with uuid: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd [ 513.143351] env[68571]: DEBUG nova.compute.resource_tracker [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] Total usable vcpus: 48, total allocated vcpus: 0 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 513.143593] env[68571]: DEBUG nova.compute.resource_tracker [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=512MB phys_disk=200GB used_disk=0GB total_vcpus=48 used_vcpus=0 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 513.252212] env[68571]: INFO nova.scheduler.client.report [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] [req-367577ca-7573-41df-913a-8c02dd69b136] Created resource provider record via placement API for resource provider with UUID 00d803b3-09f1-4a26-8896-bee0c6f9c5dd and name domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28. 
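
The INFO record above shows nova's scheduler report client (nova/scheduler/client/report.py) creating the host's resource provider in the placement API. Reduced to essentials, that call is a single authenticated POST; the following minimal sketch illustrates it with the requests library — the placement endpoint and token are placeholder assumptions, and only the UUID and provider name are taken from the log record:

import requests

# Placeholder assumptions: a real deployment discovers the endpoint from the
# keystone service catalog and authenticates with a real token.
PLACEMENT_URL = "http://placement.example/placement"
TOKEN = "REDACTED"

resp = requests.post(
    f"{PLACEMENT_URL}/resource_providers",
    headers={
        "X-Auth-Token": TOKEN,
        # From microversion 1.20 on, the created provider is echoed back.
        "OpenStack-API-Version": "placement 1.20",
    },
    # UUID and name as logged by the resource tracker above.
    json={
        "uuid": "00d803b3-09f1-4a26-8896-bee0c6f9c5dd",
        "name": "domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28",
    },
)
resp.raise_for_status()
provider = resp.json()
assert provider["generation"] == 0  # fresh providers start at generation 0

A newly created provider starts at generation 0, which matches the "generation 0" reported by set_inventory_for_provider in the records that follow.
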
[ 513.268566] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e3e5ea3-60e5-4105-8dbb-e1adb380eb0b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 513.276201] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7584bcdb-f52a-44ab-a12d-967fef0671f9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 513.304776] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b69be24d-3bed-48aa-a899-53332b53d50e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 513.311297] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c409a02a-453e-416b-b1e8-a27ccd96b088 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 513.323700] env[68571]: DEBUG nova.compute.provider_tree [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] Updating inventory in ProviderTree for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 513.367461] env[68571]: DEBUG nova.scheduler.client.report [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] Updated inventory for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd with generation 0 in Placement from set_inventory_for_provider using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:957}} [ 513.367680] env[68571]: DEBUG nova.compute.provider_tree [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] Updating resource provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd generation from 0 to 1 during operation: update_inventory {{(pid=68571) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 513.367832] env[68571]: DEBUG nova.compute.provider_tree [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] Updating inventory in ProviderTree for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 513.415914] env[68571]: DEBUG nova.compute.provider_tree [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] Updating resource 
provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd generation from 1 to 2 during operation: update_traits {{(pid=68571) _update_generation /opt/stack/nova/nova/compute/provider_tree.py:164}} [ 513.433428] env[68571]: DEBUG nova.compute.resource_tracker [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 513.433623] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.369s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 513.433788] env[68571]: DEBUG nova.service [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] Creating RPC server for service compute {{(pid=68571) start /opt/stack/nova/nova/service.py:182}} [ 513.446529] env[68571]: DEBUG nova.service [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] Join ServiceGroup membership for this service compute {{(pid=68571) start /opt/stack/nova/nova/service.py:199}} [ 513.446722] env[68571]: DEBUG nova.servicegroup.drivers.db [None req-3bcb742f-99bc-451a-ad49-3b50ebb37e1f None None] DB_Driver: join new ServiceGroup member cpu-1 to the compute group, service = {{(pid=68571) join /opt/stack/nova/nova/servicegroup/drivers/db.py:44}} [ 514.448812] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._sync_power_states {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 514.459930] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Getting list of instances from cluster (obj){ [ 514.459930] env[68571]: value = "domain-c8" [ 514.459930] env[68571]: _type = "ClusterComputeResource" [ 514.459930] env[68571]: } {{(pid=68571) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 514.461052] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe1c24dc-d01b-4073-8329-92a6efac89fa {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 514.470216] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Got total of 0 instances {{(pid=68571) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 514.470436] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 514.470734] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Getting list of instances from cluster (obj){ [ 514.470734] env[68571]: value = "domain-c8" [ 514.470734] env[68571]: _type = "ClusterComputeResource" [ 514.470734] env[68571]: } {{(pid=68571) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 514.471648] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3764011d-4daf-4c92-baf9-8fb219ca1e9b {{(pid=68571) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 514.479508] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Got total of 0 instances {{(pid=68571) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 522.743322] env[68571]: DEBUG dbcounter [-] [68571] Writing DB stats nova_cell0:SELECT=1 {{(pid=68571) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 522.745120] env[68571]: DEBUG dbcounter [-] [68571] Writing DB stats nova_cell1:SELECT=1 {{(pid=68571) stat_writer /opt/stack/data/venv/lib/python3.10/site-packages/dbcounter.py:115}} [ 550.524948] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Acquiring lock "0eae5e9a-258a-44e5-9b4f-53100f15aa7a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 550.524948] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Lock "0eae5e9a-258a-44e5-9b4f-53100f15aa7a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 550.556140] env[68571]: DEBUG nova.compute.manager [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Starting instance...
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 550.685731] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 550.686041] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 550.688466] env[68571]: INFO nova.compute.claims [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 550.872808] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-679be395-aff7-44d8-8fc3-3ad679f32169 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 550.884950] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c29ba74e-ab46-45d1-87de-54c6287214ae {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 550.920633] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8875cf92-e20a-4236-866e-243558d3fbaf {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 550.929608] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55ae0771-6d13-48bb-9bbd-c9db37094978 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 550.949271] env[68571]: DEBUG nova.compute.provider_tree [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 550.968867] env[68571]: DEBUG nova.scheduler.client.report [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 550.987982] env[68571]: DEBUG 
oslo_concurrency.lockutils [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.302s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 550.989179] env[68571]: DEBUG nova.compute.manager [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 551.051701] env[68571]: DEBUG nova.compute.utils [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 551.055041] env[68571]: DEBUG nova.compute.manager [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 551.055041] env[68571]: DEBUG nova.network.neutron [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 551.083236] env[68571]: DEBUG nova.compute.manager [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 551.260603] env[68571]: DEBUG nova.compute.manager [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 551.512171] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Acquiring lock "c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 551.512742] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Lock "c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 551.560038] env[68571]: DEBUG nova.compute.manager [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 551.576773] env[68571]: DEBUG nova.virt.hardware [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=<?>,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T21:24:40Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 551.576773] env[68571]: DEBUG nova.virt.hardware [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 551.576773] env[68571]: DEBUG nova.virt.hardware [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 551.577090] env[68571]: DEBUG nova.virt.hardware [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 551.577090] env[68571]: DEBUG nova.virt.hardware [None req-a2661521-87d7-4e9f-b367-532b6fdf930d
tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 551.577090] env[68571]: DEBUG nova.virt.hardware [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 551.577090] env[68571]: DEBUG nova.virt.hardware [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 551.577090] env[68571]: DEBUG nova.virt.hardware [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 551.577251] env[68571]: DEBUG nova.virt.hardware [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 551.577251] env[68571]: DEBUG nova.virt.hardware [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 551.577251] env[68571]: DEBUG nova.virt.hardware [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 551.585928] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f52c06c8-fda5-4231-b521-d3a15165ca9f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 551.599459] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95634ea9-a914-4aae-b26f-e2c68e0690ba {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 551.630449] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed32c8f1-3d65-474d-81ab-2c518d38f9ee {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 551.661851] env[68571]: DEBUG nova.policy [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd07cf307b20444739ac3b45f645d5a06', 
'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '93bc72f3b9714240946b1295a142f5ee', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 551.696805] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 551.697188] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 551.698853] env[68571]: INFO nova.compute.claims [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 551.813620] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a789cbb7-8590-4080-9bdb-315b9a660d50 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 551.822036] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9492332b-51ef-444f-8102-7036a0396644 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 551.854179] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3668e60c-1f38-49d4-bd24-6ea7d0bdf5bd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 551.862436] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17b1a01e-5e55-451a-9219-d3e041710b69 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 551.879279] env[68571]: DEBUG nova.compute.provider_tree [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 551.896539] env[68571]: DEBUG nova.scheduler.client.report [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 
'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 551.915180] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.218s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 551.916526] env[68571]: DEBUG nova.compute.manager [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 551.972759] env[68571]: DEBUG nova.compute.utils [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 551.975630] env[68571]: DEBUG nova.compute.manager [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 551.975855] env[68571]: DEBUG nova.network.neutron [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 551.994311] env[68571]: DEBUG nova.compute.manager [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 552.093091] env[68571]: DEBUG nova.compute.manager [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 552.123714] env[68571]: DEBUG nova.virt.hardware [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=<?>,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T21:24:40Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 552.123953] env[68571]: DEBUG nova.virt.hardware [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 552.124120] env[68571]: DEBUG nova.virt.hardware [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 552.124298] env[68571]: DEBUG nova.virt.hardware [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 552.124461] env[68571]: DEBUG nova.virt.hardware [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 552.124604] env[68571]: DEBUG nova.virt.hardware [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 552.124806] env[68571]: DEBUG nova.virt.hardware [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 552.124953] env[68571]: DEBUG nova.virt.hardware [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies
/opt/stack/nova/nova/virt/hardware.py:471}} [ 552.126178] env[68571]: DEBUG nova.virt.hardware [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 552.126369] env[68571]: DEBUG nova.virt.hardware [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 552.128564] env[68571]: DEBUG nova.virt.hardware [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 552.130239] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-017161fa-33b5-4276-b7a0-7849e5505107 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 552.147021] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2941c6e7-8a45-40b8-8866-f78e4e6cde27 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 552.184957] env[68571]: DEBUG nova.policy [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2b905e6be4f543a481f1ed16af8fd414', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '183f8b174ca5456a8143a91019cae44e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 552.743525] env[68571]: DEBUG nova.network.neutron [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Successfully created port: 627e2aef-0903-47cd-8216-b5373437cb33 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 553.037298] env[68571]: DEBUG oslo_concurrency.lockutils [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Acquiring lock "f3b237f4-6e23-4474-b841-aa3ca8c1486f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 553.037616] env[68571]: DEBUG oslo_concurrency.lockutils [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Lock "f3b237f4-6e23-4474-b841-aa3ca8c1486f" acquired by
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 553.050435] env[68571]: DEBUG nova.compute.manager [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 553.143463] env[68571]: DEBUG oslo_concurrency.lockutils [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 553.143463] env[68571]: DEBUG oslo_concurrency.lockutils [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 553.150613] env[68571]: INFO nova.compute.claims [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 553.317604] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e309b8a5-005f-44f4-a8af-bb83cd776d44 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 553.325884] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-358543e5-cb69-4143-9838-6a7096e9f23e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 553.365113] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-484effc0-12ec-483e-a134-bbed5ce36d18 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 553.374766] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3866108f-800e-4ea9-88f6-19a5c6580e6a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 553.389303] env[68571]: DEBUG nova.compute.provider_tree [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 553.414329] env[68571]: DEBUG nova.scheduler.client.report [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 
'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 553.452035] env[68571]: DEBUG oslo_concurrency.lockutils [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.306s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 553.452035] env[68571]: DEBUG nova.compute.manager [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 553.531205] env[68571]: DEBUG nova.compute.utils [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 553.541397] env[68571]: DEBUG nova.compute.manager [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 553.541397] env[68571]: DEBUG nova.network.neutron [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 553.548959] env[68571]: DEBUG nova.compute.manager [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 553.660768] env[68571]: DEBUG nova.compute.manager [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 553.698962] env[68571]: DEBUG nova.virt.hardware [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=<?>,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T21:24:40Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 553.698962] env[68571]: DEBUG nova.virt.hardware [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 553.699357] env[68571]: DEBUG nova.virt.hardware [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 553.699579] env[68571]: DEBUG nova.virt.hardware [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 553.699734] env[68571]: DEBUG nova.virt.hardware [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 553.700197] env[68571]: DEBUG nova.virt.hardware [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 553.700197] env[68571]: DEBUG nova.virt.hardware [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 553.700743] env[68571]: DEBUG nova.virt.hardware [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 553.700743] env[68571]: DEBUG nova.virt.hardware [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050
tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 553.700992] env[68571]: DEBUG nova.virt.hardware [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 553.701192] env[68571]: DEBUG nova.virt.hardware [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 553.702607] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ec3b61d-3b28-40cb-bb31-0d712a53bcc0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 553.713375] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bd00d8b-3a22-40c6-bde5-665cb36c9231 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 553.808248] env[68571]: DEBUG nova.network.neutron [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Successfully created port: f4bc75b7-4935-4c48-97eb-f421f6fe8ba2 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 553.815327] env[68571]: DEBUG nova.policy [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f340966281a848b783b75a5c89986e6f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '077f1a9875da491ab41f825a6faab831', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 553.936496] env[68571]: DEBUG oslo_concurrency.lockutils [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Acquiring lock "e49a885d-c0d2-414b-b1f0-bfc3a710e9ad" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 553.936817] env[68571]: DEBUG oslo_concurrency.lockutils [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Lock "e49a885d-c0d2-414b-b1f0-bfc3a710e9ad" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 553.960601]
env[68571]: DEBUG nova.compute.manager [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 554.126635] env[68571]: DEBUG oslo_concurrency.lockutils [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 554.126935] env[68571]: DEBUG oslo_concurrency.lockutils [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 554.134538] env[68571]: INFO nova.compute.claims [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 554.327357] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e76eed5-5d01-4a3b-b642-519283ada1ef {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 554.338043] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4b9e4184-2e3a-4418-a092-4b672f5ebaa1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 554.382086] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f919d9ec-954f-443f-bd22-b7b35a8067d7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 554.391560] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4afaccdb-b2a1-4e08-9143-930d5a7643b9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 554.406561] env[68571]: DEBUG nova.compute.provider_tree [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 554.429043] env[68571]: DEBUG nova.scheduler.client.report [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 
'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 554.452912] env[68571]: DEBUG oslo_concurrency.lockutils [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.326s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 554.453569] env[68571]: DEBUG nova.compute.manager [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 554.520826] env[68571]: DEBUG nova.compute.utils [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 554.523212] env[68571]: DEBUG nova.compute.manager [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 554.523535] env[68571]: DEBUG nova.network.neutron [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 554.557242] env[68571]: DEBUG nova.compute.manager [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 554.667917] env[68571]: DEBUG nova.compute.manager [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 554.695328] env[68571]: DEBUG nova.virt.hardware [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 554.695580] env[68571]: DEBUG nova.virt.hardware [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 554.695864] env[68571]: DEBUG nova.virt.hardware [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 554.696752] env[68571]: DEBUG nova.virt.hardware [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 554.698618] env[68571]: DEBUG nova.virt.hardware [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 554.699314] env[68571]: DEBUG nova.virt.hardware [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 554.699640] env[68571]: DEBUG nova.virt.hardware [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 554.699754] env[68571]: DEBUG nova.virt.hardware [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 554.700305] env[68571]: DEBUG nova.virt.hardware [None 
req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 554.700305] env[68571]: DEBUG nova.virt.hardware [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 554.700305] env[68571]: DEBUG nova.virt.hardware [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 554.701233] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd971694-b478-4b17-a215-adde793f9d70 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 554.711733] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d3019c9-ce46-4e67-b13c-ce2a639e48d4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 554.740506] env[68571]: DEBUG nova.policy [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7e2186362c2240229839c5f306f036dd', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '22d74ff9ceb5495e8524af57fb9ae473', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 554.753932] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Acquiring lock "4c5c97bc-4a9f-413b-a75f-a197270103a2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 554.754133] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Lock "4c5c97bc-4a9f-413b-a75f-a197270103a2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 554.768739] env[68571]: DEBUG nova.compute.manager [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Starting instance... 
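The topology trace above (flavor/image limits 0:0:0 falling back to 65536, exactly one candidate for one vCPU) is an exhaustive factorization search. A pure-Python sketch of the same selection, using the defaults shown in the log; possible_topologies is an illustrative helper, not nova.virt.hardware itself:

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        # Enumerate (sockets, cores, threads) whose product is exactly vcpus,
        # honoring the per-dimension caps from flavor/image limits.
        found = []
        for s in range(1, min(vcpus, max_sockets) + 1):
            for c in range(1, min(vcpus, max_cores) + 1):
                for t in range(1, min(vcpus, max_threads) + 1):
                    if s * c * t == vcpus:
                        found.append((s, c, t))
        return found

    assert possible_topologies(1) == [(1, 1, 1)]  # "Got 1 possible topologies"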
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 554.832242] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 554.832527] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 554.834179] env[68571]: INFO nova.compute.claims [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 555.004612] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26616020-f401-4aa8-b50b-5ed5068ea430 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.012627] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-116dd4fa-cbd5-40cb-ac50-e25f45c633ce {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.049466] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62bcdc75-fd41-4e0e-80fd-b078e831335a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.057607] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfb89748-58e6-4afc-9d22-1571f86f275f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.072535] env[68571]: DEBUG nova.compute.provider_tree [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 555.086606] env[68571]: DEBUG nova.scheduler.client.report [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 555.106562] env[68571]: DEBUG 
oslo_concurrency.lockutils [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.274s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 555.106936] env[68571]: DEBUG nova.compute.manager [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 555.167111] env[68571]: DEBUG nova.compute.utils [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 555.168578] env[68571]: DEBUG nova.compute.manager [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Not allocating networking since 'none' was specified. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 555.181239] env[68571]: DEBUG nova.compute.manager [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 555.276454] env[68571]: DEBUG nova.compute.manager [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Start spawning the instance on the hypervisor. 
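For the inventory dict these reports keep repeating, placement derives usable capacity per resource class as (total - reserved) * allocation_ratio, with max_unit capping any single allocation. A worked example with the logged values (the snippet is editorial, not placement code):

    inventory = {
        "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0, "max_unit": 16},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0, "max_unit": 65530},
        "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0, "max_unit": 94},
    }
    for rc, inv in inventory.items():
        print(rc, (inv["total"] - inv["reserved"]) * inv["allocation_ratio"])
    # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0 -- an m1.nano claim
    # (1 VCPU / 128 MB / 1 GB) fits easily, hence "Claim successful" above.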
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 555.312032] env[68571]: DEBUG nova.virt.hardware [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 555.312032] env[68571]: DEBUG nova.virt.hardware [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 555.312032] env[68571]: DEBUG nova.virt.hardware [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 555.312244] env[68571]: DEBUG nova.virt.hardware [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 555.312244] env[68571]: DEBUG nova.virt.hardware [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 555.312244] env[68571]: DEBUG nova.virt.hardware [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 555.312576] env[68571]: DEBUG nova.virt.hardware [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 555.312901] env[68571]: DEBUG nova.virt.hardware [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 
555.313196] env[68571]: DEBUG nova.virt.hardware [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 555.313487] env[68571]: DEBUG nova.virt.hardware [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 555.314153] env[68571]: DEBUG nova.virt.hardware [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 555.315465] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b31d2876-b285-4166-b880-d3d8649724e1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.324747] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5782eb2b-8be4-4a9c-9b30-1bb687464241 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.333626] env[68571]: DEBUG nova.network.neutron [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Successfully updated port: 627e2aef-0903-47cd-8216-b5373437cb33 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 555.344564] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Instance VIF info [] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 555.355878] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Creating folder: OpenStack. Parent ref: group-v4. 
{{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 555.357220] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-dc4811a5-da6f-42ea-99a3-563b3016045f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.359426] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Acquiring lock "refresh_cache-c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 555.359688] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Acquired lock "refresh_cache-c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 555.359934] env[68571]: DEBUG nova.network.neutron [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 555.374245] env[68571]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. [ 555.374580] env[68571]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=68571) _invoke_api /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:337}} [ 555.375258] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Folder already exists: OpenStack. Parent ref: group-v4. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}} [ 555.376026] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Creating folder: Project (cce7a31d2bdd452991c76e4da3dd1582). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 555.376648] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-30cae027-7197-4cfd-96b0-dc6397f9b18a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.386607] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Created folder: Project (cce7a31d2bdd452991c76e4da3dd1582) in parent group-v692787. [ 555.387156] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Creating folder: Instances. Parent ref: group-v692791. 
{{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 555.387508] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f43311fd-0542-4d93-ab9e-2415c4d10241 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.396860] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Created folder: Instances in parent group-v692791. [ 555.397269] env[68571]: DEBUG oslo.service.loopingcall [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 555.397586] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 555.399120] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-dcb7a677-b1f8-4ec8-a34d-f8ae58a5434a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.423030] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 555.423030] env[68571]: value = "task-3467553" [ 555.423030] env[68571]: _type = "Task" [ 555.423030] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 555.431338] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467553, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 555.564142] env[68571]: DEBUG nova.network.neutron [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 555.595241] env[68571]: DEBUG nova.network.neutron [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Successfully created port: 5f023740-f8c1-4ca9-85c4-fae9770d09c2 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 555.935624] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467553, 'name': CreateVM_Task, 'duration_secs': 0.327997} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 555.935857] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 555.937140] env[68571]: DEBUG oslo_vmware.service [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1d8044c3-33c5-48dc-b134-687ed617d31a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.945934] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 555.946077] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 555.949907] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 555.949907] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9bd7fc37-0109-47e3-9006-f13908a90dda {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 555.952512] env[68571]: DEBUG oslo_vmware.api [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Waiting for the task: (returnval){ [ 555.952512] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52243383-28f7-fc7f-af43-055af77144d4" [ 555.952512] env[68571]: _type = "Task" [ 555.952512] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 555.960788] env[68571]: DEBUG oslo_vmware.api [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52243383-28f7-fc7f-af43-055af77144d4, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 556.447142] env[68571]: DEBUG nova.network.neutron [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Updating instance_info_cache with network_info: [{"id": "627e2aef-0903-47cd-8216-b5373437cb33", "address": "fa:16:3e:a9:a9:93", "network": {"id": "802e91c0-b497-4996-a9a8-0fb2969a1fd5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "129da41d4b1a4202be57f86562f628cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap627e2aef-09", "ovs_interfaceid": "627e2aef-0903-47cd-8216-b5373437cb33", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 556.465085] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 556.465739] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 556.465861] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 556.466017] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 556.466437] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 
tempest-ServerDiagnosticsV248Test-1249713863-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 556.466937] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-25c3c090-f7e7-4627-8a7d-ed5eac4f943b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 556.475836] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Releasing lock "refresh_cache-c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 556.476143] env[68571]: DEBUG nova.compute.manager [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Instance network_info: |[{"id": "627e2aef-0903-47cd-8216-b5373437cb33", "address": "fa:16:3e:a9:a9:93", "network": {"id": "802e91c0-b497-4996-a9a8-0fb2969a1fd5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "129da41d4b1a4202be57f86562f628cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap627e2aef-09", "ovs_interfaceid": "627e2aef-0903-47cd-8216-b5373437cb33", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 556.476877] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:a9:a9:93', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f3d7e184-c87f-47a5-8d0d-9fa20e07e669', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '627e2aef-0903-47cd-8216-b5373437cb33', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 556.488508] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Creating folder: Project (183f8b174ca5456a8143a91019cae44e). Parent ref: group-v692787. 
{{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 556.491827] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-99359ac0-c86e-441f-b60a-9bbcdd10c627 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 556.492793] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 556.492992] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 556.493821] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8fd926dd-513d-4ab9-b64a-9c786d25d18c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 556.507162] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Created folder: Project (183f8b174ca5456a8143a91019cae44e) in parent group-v692787. [ 556.507162] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Creating folder: Instances. Parent ref: group-v692794. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 556.508465] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a0f540bb-418a-48ee-955a-dab09490f188 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 556.513374] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-233dbd13-184e-43b6-a18f-558ef83532bf {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 556.520331] env[68571]: DEBUG oslo_vmware.api [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Waiting for the task: (returnval){ [ 556.520331] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52cad533-e04c-6032-9146-f556d220e58c" [ 556.520331] env[68571]: _type = "Task" [ 556.520331] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 556.521663] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Created folder: Instances in parent group-v692794. 
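The "Instance VIF info" entry above is a direct reduction of the neutron port payload logged just before it: the bridge becomes network_name, the port MAC becomes mac_address, and the NSX logical-switch id becomes an OpaqueNetwork reference. A sketch of that mapping (hypothetical helper; the field names are the ones visible in the logged dicts):

    def to_vif_info(vif):
        return {
            "network_name": vif["network"]["bridge"],   # 'br-int'
            "mac_address": vif["address"],              # 'fa:16:3e:a9:a9:93'
            "network_ref": {
                "type": "OpaqueNetwork",
                "network-id": vif["details"]["nsx-logical-switch-id"],
                "network-type": "nsx.LogicalSwitch",
                "use-external-id": True,
            },
            "iface_id": vif["id"],
            "vif_model": "vmxnet3",
        }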
[ 556.521914] env[68571]: DEBUG oslo.service.loopingcall [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 556.525968] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 556.525968] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-48a582bc-1e95-44d8-8168-6a2209b64b2f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 556.550052] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 556.550309] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Creating directory with path [datastore1] vmware_temp/1b2b5c84-386d-4739-bef6-6c90336e875e/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 556.551310] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 556.551310] env[68571]: value = "task-3467556" [ 556.551310] env[68571]: _type = "Task" [ 556.551310] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 556.551310] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-194dc12c-584a-401d-a54b-22c94e83c6d7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 556.563722] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467556, 'name': CreateVM_Task} progress is 6%. 
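The mkdir entries above go through vSphere's FileManager: MakeDirectory with createParentDirectories=True builds the whole datastore path in one call, so the image-cache and vmware_temp directories can be created without checking for parents first. A sketch, assuming session and dc_ref are an authenticated session and a datacenter moref:

    def mkdir(session, ds_path, dc_ref):
        # ds_path e.g. "[datastore1] vmware_temp/<uuid>/<image-id>"
        file_manager = session.vim.service_content.fileManager
        session.invoke_api(session.vim, "MakeDirectory", file_manager,
                           name=ds_path, datacenter=dc_ref,
                           createParentDirectories=True)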
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 556.576190] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Created directory with path [datastore1] vmware_temp/1b2b5c84-386d-4739-bef6-6c90336e875e/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 556.577392] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Fetch image to [datastore1] vmware_temp/1b2b5c84-386d-4739-bef6-6c90336e875e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 556.577689] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/1b2b5c84-386d-4739-bef6-6c90336e875e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 556.581306] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cead2460-a76f-40ac-a757-6eb195833713 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 556.588286] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79c65e39-819e-4180-959a-cea4d245ddaa {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 556.601497] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9d8ab04-1c07-4da0-a2ec-f6a94f0c03e5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 556.651836] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3612dd54-7d19-48d6-8b77-b399a96fe7be {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 556.661247] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c2711f0d-d4e4-4155-8d1d-516ce1d4f092 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 556.674116] env[68571]: DEBUG nova.network.neutron [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Successfully created port: b622f4e9-7794-4a33-9e4c-2b57102c8116 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 556.689824] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 
4c5c97bc-4a9f-413b-a75f-a197270103a2] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 556.770567] env[68571]: DEBUG oslo_vmware.rw_handles [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1b2b5c84-386d-4739-bef6-6c90336e875e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 556.838387] env[68571]: DEBUG oslo_vmware.rw_handles [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 556.838476] env[68571]: DEBUG oslo_vmware.rw_handles [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1b2b5c84-386d-4739-bef6-6c90336e875e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 557.068088] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467556, 'name': CreateVM_Task, 'duration_secs': 0.369858} completed successfully. 
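The rw_handles lines above stream the 21318656-byte sparse VMDK to the ESX host's /folder HTTP endpoint, with dcPath and dsName selecting the datastore path. A protocol-level approximation using requests; the URL, cookie value, and local file below are placeholders, not values recovered from this log:

    import requests

    url = ("https://esx-host:443/folder/vmware_temp/tmp-sparse.vmdk"
           "?dcPath=ha-datacenter&dsName=datastore1")
    with open("tmp-sparse.vmdk", "rb") as image:
        resp = requests.put(url, data=image,  # streamed upload
                            cookies={"vmware_soap_session": "placeholder"},
                            verify=False)
    resp.raise_for_status()  # oslo.vmware then closes its write handle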
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 557.068337] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 557.091245] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 557.091431] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 557.091994] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 557.094941] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-837605c3-a4f0-4f59-b5cd-e1971203f27b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 557.102175] env[68571]: DEBUG oslo_vmware.api [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Waiting for the task: (returnval){ [ 557.102175] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]528e651a-8f83-1448-79c3-49d61c3c8dde" [ 557.102175] env[68571]: _type = "Task" [ 557.102175] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 557.110747] env[68571]: DEBUG oslo_vmware.api [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]528e651a-8f83-1448-79c3-49d61c3c8dde, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 557.151020] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Acquiring lock "349dd3c9-5769-458c-b7fa-ef08ce7d6b5d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 557.151934] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Lock "349dd3c9-5769-458c-b7fa-ef08ce7d6b5d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 557.170299] env[68571]: DEBUG nova.compute.manager [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 557.272745] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 557.273241] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 557.275450] env[68571]: INFO nova.compute.claims [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 557.362200] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Acquiring lock "8c30562a-4a81-4007-923c-3bc0b922f01c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 557.362473] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Lock "8c30562a-4a81-4007-923c-3bc0b922f01c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
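
The lock entries above naming "build_and_run_instance.<locals>._locked_do_build_and_run_instance" show how Nova serializes builds per instance: the manager defines a nested function and guards it with an oslo.concurrency lock keyed on the instance UUID, and lockutils' "inner" wrapper emits the Acquiring/acquired/released lines with the decorated callable's qualified name (hence the "<locals>" segment). A minimal, self-contained sketch of that pattern; lockutils.synchronized is the real oslo.concurrency API, while the surrounding function bodies are illustrative rather than Nova's actual code:

from oslo_concurrency import lockutils

def build_and_run_instance(instance_uuid):
    # A nested function: its __qualname__ ends with
    # "build_and_run_instance.<locals>._locked_do_build_and_run_instance",
    # matching the callable names in the lock log lines above.
    @lockutils.synchronized(instance_uuid)
    def _locked_do_build_and_run_instance():
        # Within one process, only one build per instance UUID runs at a
        # time; a second caller blocks here, and the log records the wait.
        return "building %s" % instance_uuid

    return _locked_do_build_and_run_instance()

print(build_and_run_instance("349dd3c9-5769-458c-b7fa-ef08ce7d6b5d"))
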
[ 557.381431] env[68571]: DEBUG nova.compute.manager [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 557.470504] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 557.498389] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81d30c23-02f1-4516-b4c9-a00d1fd5aa1f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 557.508038] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d28de76e-a94e-43b6-9ad9-c18df475894f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 557.541357] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6eda65e0-2f64-424c-9a72-e3850fe16f53 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 557.549108] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f93b9e63-a69d-40df-b643-01c7ca6a328a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 557.562937] env[68571]: DEBUG nova.compute.provider_tree [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 557.575253] env[68571]: DEBUG nova.scheduler.client.report [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 557.592226] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.319s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 557.592753] env[68571]: DEBUG nova.compute.manager [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 
349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 557.597418] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.125s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 557.597418] env[68571]: INFO nova.compute.claims [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 557.614361] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 557.614505] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 557.614659] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 557.644231] env[68571]: DEBUG nova.compute.utils [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 557.645753] env[68571]: DEBUG nova.compute.manager [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 557.645937] env[68571]: DEBUG nova.network.neutron [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 557.662321] env[68571]: DEBUG nova.compute.manager [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Start building block device mappings for instance. 
{{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 557.763255] env[68571]: DEBUG nova.compute.manager [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Start spawning the instance on the hypervisor. {{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 557.798164] env[68571]: DEBUG nova.virt.hardware [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 557.798164] env[68571]: DEBUG nova.virt.hardware [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 557.798164] env[68571]: DEBUG nova.virt.hardware [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 557.798464] env[68571]: DEBUG nova.virt.hardware [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 557.798464] env[68571]: DEBUG nova.virt.hardware [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 557.798464] env[68571]: DEBUG nova.virt.hardware [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 557.798464] env[68571]: DEBUG nova.virt.hardware [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 557.798464] env[68571]: DEBUG 
nova.virt.hardware [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 557.798659] env[68571]: DEBUG nova.virt.hardware [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 557.798659] env[68571]: DEBUG nova.virt.hardware [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 557.798799] env[68571]: DEBUG nova.virt.hardware [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 557.801372] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd9a220b-8846-4400-8e77-db86dfa65e80 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 557.813394] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cebdd9e0-e6f2-41f3-b444-55f9d9373370 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 557.848166] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba468e0e-521d-46b0-b60a-4b66971c701d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 557.855898] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ecb9e52f-f524-4380-ab44-310feac56c83 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 557.861820] env[68571]: DEBUG nova.policy [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f487141eae96491480b072daf61453d2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '08c02345327245a99a5bb11408b51c6d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 557.866800] env[68571]: DEBUG nova.network.neutron [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Successfully updated port: 5f023740-f8c1-4ca9-85c4-fae9770d09c2 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 557.909402] env[68571]: DEBUG 
oslo_concurrency.lockutils [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Acquiring lock "refresh_cache-f3b237f4-6e23-4474-b841-aa3ca8c1486f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 557.909402] env[68571]: DEBUG oslo_concurrency.lockutils [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Acquired lock "refresh_cache-f3b237f4-6e23-4474-b841-aa3ca8c1486f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 557.909402] env[68571]: DEBUG nova.network.neutron [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 557.910159] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97f32aa9-e903-404b-ada6-8ae6adb249e8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 557.921318] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20f69762-e57b-4bb5-bf39-acfb30f4fd8c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 557.943043] env[68571]: DEBUG nova.compute.provider_tree [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 557.961713] env[68571]: DEBUG nova.scheduler.client.report [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 557.994419] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.399s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 557.994966] env[68571]: DEBUG nova.compute.manager [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Start building networks asynchronously for instance. 
{{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 558.058104] env[68571]: DEBUG nova.compute.utils [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 558.060718] env[68571]: DEBUG nova.compute.manager [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Not allocating networking since 'none' was specified. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 558.073180] env[68571]: DEBUG nova.compute.manager [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 558.081387] env[68571]: DEBUG nova.network.neutron [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 558.154372] env[68571]: DEBUG nova.compute.manager [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 558.191391] env[68571]: DEBUG nova.virt.hardware [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 558.192212] env[68571]: DEBUG nova.virt.hardware [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 558.192665] env[68571]: DEBUG nova.virt.hardware [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 558.193049] env[68571]: DEBUG nova.virt.hardware [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 558.193049] env[68571]: DEBUG nova.virt.hardware [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 558.193156] env[68571]: DEBUG nova.virt.hardware [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 558.193497] env[68571]: DEBUG nova.virt.hardware [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 558.193497] env[68571]: DEBUG nova.virt.hardware [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 558.193869] env[68571]: DEBUG nova.virt.hardware [None req-c868e849-a073-4b29-843d-ff042737ad65 
tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 558.193869] env[68571]: DEBUG nova.virt.hardware [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 558.194020] env[68571]: DEBUG nova.virt.hardware [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 558.195634] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bd3f753-cf3a-47a1-afcb-dfaf1baf7475 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 558.205805] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6de71b6c-3626-463b-9766-d3334805964a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 558.224626] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Instance VIF info [] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 558.231756] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Creating folder: Project (5b00f1d6e8d3409f8cc056b449cee391). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 558.232973] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0442b195-b3f2-4b02-a8be-4b6a0b67106b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 558.247904] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Created folder: Project (5b00f1d6e8d3409f8cc056b449cee391) in parent group-v692787. [ 558.248136] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Creating folder: Instances. Parent ref: group-v692797. 
{{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 558.248388] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-bfa223f4-85e6-49c8-a90b-fed64b7fdd64 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 558.258175] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Created folder: Instances in parent group-v692797. [ 558.258462] env[68571]: DEBUG oslo.service.loopingcall [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 558.258652] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 558.259470] env[68571]: DEBUG nova.network.neutron [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Successfully updated port: f4bc75b7-4935-4c48-97eb-f421f6fe8ba2 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 558.261811] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-4ef7e97a-a68a-48c1-bbb0-cd6214870819 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 558.283330] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Acquiring lock "refresh_cache-0eae5e9a-258a-44e5-9b4f-53100f15aa7a" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 558.283469] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Acquired lock "refresh_cache-0eae5e9a-258a-44e5-9b4f-53100f15aa7a" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 558.283621] env[68571]: DEBUG nova.network.neutron [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 558.290658] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 558.290658] env[68571]: value = "task-3467560" [ 558.290658] env[68571]: _type = "Task" [ 558.290658] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 558.300774] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467560, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 558.457207] env[68571]: DEBUG nova.compute.manager [req-87918275-c83f-4bd9-81f2-a3dcd57bda0c req-1b11d262-d093-4d47-8cc4-1dba5f3a061e service nova] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Received event network-vif-plugged-627e2aef-0903-47cd-8216-b5373437cb33 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 558.457366] env[68571]: DEBUG oslo_concurrency.lockutils [req-87918275-c83f-4bd9-81f2-a3dcd57bda0c req-1b11d262-d093-4d47-8cc4-1dba5f3a061e service nova] Acquiring lock "c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 558.457509] env[68571]: DEBUG oslo_concurrency.lockutils [req-87918275-c83f-4bd9-81f2-a3dcd57bda0c req-1b11d262-d093-4d47-8cc4-1dba5f3a061e service nova] Lock "c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 558.457740] env[68571]: DEBUG oslo_concurrency.lockutils [req-87918275-c83f-4bd9-81f2-a3dcd57bda0c req-1b11d262-d093-4d47-8cc4-1dba5f3a061e service nova] Lock "c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 558.457931] env[68571]: DEBUG nova.compute.manager [req-87918275-c83f-4bd9-81f2-a3dcd57bda0c req-1b11d262-d093-4d47-8cc4-1dba5f3a061e service nova] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] No waiting events found dispatching network-vif-plugged-627e2aef-0903-47cd-8216-b5373437cb33 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 558.458020] env[68571]: WARNING nova.compute.manager [req-87918275-c83f-4bd9-81f2-a3dcd57bda0c req-1b11d262-d093-4d47-8cc4-1dba5f3a061e service nova] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Received unexpected event network-vif-plugged-627e2aef-0903-47cd-8216-b5373437cb33 for instance with vm_state building and task_state spawning. 
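
The sequence ending in the WARNING above is Nova's external-event dispatch: neutron reports network-vif-plugged for the port, InstanceEvents.pop_instance_event looks up a registered waiter under the per-instance "-events" lock, finds none ("No waiting events found"), and the event is logged as unexpected because the port became active before the driver started waiting for it during spawn. A minimal sketch of that waiter-registry pattern, with illustrative names rather than Nova's actual InstanceEvents implementation:

import threading

class InstanceEventRegistry:
    def __init__(self):
        self._lock = threading.Lock()  # stands in for the "<uuid>-events" lock
        self._waiters = {}             # (instance_uuid, event_name) -> Event

    def prepare(self, instance_uuid, event_name):
        # Register interest before triggering the action that emits the event.
        waiter = threading.Event()
        with self._lock:
            self._waiters[(instance_uuid, event_name)] = waiter
        return waiter

    def pop_and_send(self, instance_uuid, event_name):
        # Dispatch an incoming external event to its waiter, if any.
        with self._lock:
            waiter = self._waiters.pop((instance_uuid, event_name), None)
        if waiter is None:
            # No waiter registered yet: the "Received unexpected event" case.
            print("unexpected event %s for %s" % (event_name, instance_uuid))
            return False
        waiter.set()
        return True

registry = InstanceEventRegistry()
uuid = "c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471"
event = "network-vif-plugged-627e2aef-0903-47cd-8216-b5373437cb33"
registry.pop_and_send(uuid, event)      # arrives early -> warning path
waiter = registry.prepare(uuid, event)  # normal path: register first...
registry.pop_and_send(uuid, event)      # ...then the event unblocks the waiter
assert waiter.is_set()
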
[ 558.466140] env[68571]: DEBUG nova.network.neutron [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Updating instance_info_cache with network_info: [{"id": "5f023740-f8c1-4ca9-85c4-fae9770d09c2", "address": "fa:16:3e:5d:9f:8e", "network": {"id": "802e91c0-b497-4996-a9a8-0fb2969a1fd5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.138", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "129da41d4b1a4202be57f86562f628cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5f023740-f8", "ovs_interfaceid": "5f023740-f8c1-4ca9-85c4-fae9770d09c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 558.489232] env[68571]: DEBUG oslo_concurrency.lockutils [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Releasing lock "refresh_cache-f3b237f4-6e23-4474-b841-aa3ca8c1486f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 558.489534] env[68571]: DEBUG nova.compute.manager [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Instance network_info: |[{"id": "5f023740-f8c1-4ca9-85c4-fae9770d09c2", "address": "fa:16:3e:5d:9f:8e", "network": {"id": "802e91c0-b497-4996-a9a8-0fb2969a1fd5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.138", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "129da41d4b1a4202be57f86562f628cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5f023740-f8", "ovs_interfaceid": "5f023740-f8c1-4ca9-85c4-fae9770d09c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 558.489920] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 
tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:5d:9f:8e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f3d7e184-c87f-47a5-8d0d-9fa20e07e669', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5f023740-f8c1-4ca9-85c4-fae9770d09c2', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 558.501904] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Creating folder: Project (077f1a9875da491ab41f825a6faab831). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 558.502603] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e51ce34b-bb0c-4df0-b70d-5320e177a9f6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 558.516426] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Created folder: Project (077f1a9875da491ab41f825a6faab831) in parent group-v692787. [ 558.516426] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Creating folder: Instances. Parent ref: group-v692800. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 558.516595] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b55140bf-932a-44a5-ac1c-b15901c6477b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 558.526849] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Created folder: Instances in parent group-v692800. [ 558.527098] env[68571]: DEBUG oslo.service.loopingcall [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 558.527286] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 558.527537] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9ff31c03-1b87-420f-bfcf-7509ef8e8e43 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 558.551643] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 558.551643] env[68571]: value = "task-3467563" [ 558.551643] env[68571]: _type = "Task" [ 558.551643] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 558.568196] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467563, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 558.686761] env[68571]: DEBUG nova.network.neutron [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Successfully updated port: b622f4e9-7794-4a33-9e4c-2b57102c8116 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 558.695983] env[68571]: DEBUG oslo_concurrency.lockutils [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Acquiring lock "refresh_cache-e49a885d-c0d2-414b-b1f0-bfc3a710e9ad" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 558.696302] env[68571]: DEBUG oslo_concurrency.lockutils [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Acquired lock "refresh_cache-e49a885d-c0d2-414b-b1f0-bfc3a710e9ad" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 558.696555] env[68571]: DEBUG nova.network.neutron [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 558.746786] env[68571]: DEBUG nova.network.neutron [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 558.779197] env[68571]: DEBUG nova.network.neutron [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 558.804074] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467560, 'name': CreateVM_Task, 'duration_secs': 0.305356} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 558.805085] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 558.805085] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 558.805085] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 558.805369] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 558.805369] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-df02f1fc-404c-4366-96f7-5c726e6850d3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 558.819365] env[68571]: DEBUG oslo_vmware.api [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Waiting for the task: (returnval){ [ 558.819365] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]5241e965-a7a4-f423-75af-0a6601be7808" [ 558.819365] env[68571]: _type = "Task" [ 558.819365] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 558.826605] env[68571]: DEBUG oslo_vmware.api [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]5241e965-a7a4-f423-75af-0a6601be7808, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 559.065671] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467563, 'name': CreateVM_Task} progress is 99%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 559.136447] env[68571]: DEBUG nova.network.neutron [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Successfully created port: 1190541d-54f9-4a7c-9bc5-3b0b4251f5cb {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 559.326540] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 559.326885] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 559.327013] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 559.475597] env[68571]: DEBUG nova.network.neutron [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Updating instance_info_cache with network_info: [{"id": "b622f4e9-7794-4a33-9e4c-2b57102c8116", "address": "fa:16:3e:f7:87:2c", "network": {"id": "802e91c0-b497-4996-a9a8-0fb2969a1fd5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.194", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "129da41d4b1a4202be57f86562f628cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb622f4e9-77", "ovs_interfaceid": "b622f4e9-7794-4a33-9e4c-2b57102c8116", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 559.495937] env[68571]: DEBUG oslo_concurrency.lockutils [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Releasing lock "refresh_cache-e49a885d-c0d2-414b-b1f0-bfc3a710e9ad" {{(pid=68571) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 559.496273] env[68571]: DEBUG nova.compute.manager [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Instance network_info: |[{"id": "b622f4e9-7794-4a33-9e4c-2b57102c8116", "address": "fa:16:3e:f7:87:2c", "network": {"id": "802e91c0-b497-4996-a9a8-0fb2969a1fd5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.194", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "129da41d4b1a4202be57f86562f628cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb622f4e9-77", "ovs_interfaceid": "b622f4e9-7794-4a33-9e4c-2b57102c8116", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 559.496716] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:f7:87:2c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f3d7e184-c87f-47a5-8d0d-9fa20e07e669', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b622f4e9-7794-4a33-9e4c-2b57102c8116', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 559.507328] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Creating folder: Project (22d74ff9ceb5495e8524af57fb9ae473). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 559.508009] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a8783b40-c53d-45e0-82c9-051037580ead {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 559.519094] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Created folder: Project (22d74ff9ceb5495e8524af57fb9ae473) in parent group-v692787. [ 559.519094] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Creating folder: Instances. Parent ref: group-v692803. 
{{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 559.519094] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7f74950f-f449-4763-bef6-0d5de80f5057 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 559.527061] env[68571]: DEBUG nova.network.neutron [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Updating instance_info_cache with network_info: [{"id": "f4bc75b7-4935-4c48-97eb-f421f6fe8ba2", "address": "fa:16:3e:67:b6:1d", "network": {"id": "802e91c0-b497-4996-a9a8-0fb2969a1fd5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.108", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "129da41d4b1a4202be57f86562f628cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf4bc75b7-49", "ovs_interfaceid": "f4bc75b7-4935-4c48-97eb-f421f6fe8ba2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 559.532773] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Created folder: Instances in parent group-v692803. [ 559.535151] env[68571]: DEBUG oslo.service.loopingcall [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 559.535151] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 559.535151] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6d498689-17f6-49bb-b16c-ca95f66633d0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 559.552580] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Releasing lock "refresh_cache-0eae5e9a-258a-44e5-9b4f-53100f15aa7a" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 559.552876] env[68571]: DEBUG nova.compute.manager [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Instance network_info: |[{"id": "f4bc75b7-4935-4c48-97eb-f421f6fe8ba2", "address": "fa:16:3e:67:b6:1d", "network": {"id": "802e91c0-b497-4996-a9a8-0fb2969a1fd5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.108", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "129da41d4b1a4202be57f86562f628cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf4bc75b7-49", "ovs_interfaceid": "f4bc75b7-4935-4c48-97eb-f421f6fe8ba2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 559.553656] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:67:b6:1d', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f3d7e184-c87f-47a5-8d0d-9fa20e07e669', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f4bc75b7-4935-4c48-97eb-f421f6fe8ba2', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 559.562174] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Creating folder: Project (93bc72f3b9714240946b1295a142f5ee). Parent ref: group-v692787. 
{{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 559.563866] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e5c4788a-9739-43a7-81db-c236f8c1cbfa {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 559.565578] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 559.565578] env[68571]: value = "task-3467567" [ 559.565578] env[68571]: _type = "Task" [ 559.565578] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 559.577374] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Created folder: Project (93bc72f3b9714240946b1295a142f5ee) in parent group-v692787. [ 559.577374] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Creating folder: Instances. Parent ref: group-v692805. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 559.581273] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-055a9402-473f-4c18-a80b-e90f2f6f4098 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 559.583766] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467563, 'name': CreateVM_Task, 'duration_secs': 0.541033} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 559.584635] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 559.585735] env[68571]: DEBUG oslo_concurrency.lockutils [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 559.585807] env[68571]: DEBUG oslo_concurrency.lockutils [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 559.586947] env[68571]: DEBUG oslo_concurrency.lockutils [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 559.587202] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-25ad29ae-f092-4035-b621-50795e966c2d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 559.597322] env[68571]: DEBUG 
oslo_vmware.api [-] Task: {'id': task-3467567, 'name': CreateVM_Task} progress is 15%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 559.597593] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Created folder: Instances in parent group-v692805. [ 559.597898] env[68571]: DEBUG oslo.service.loopingcall [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 559.598357] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 559.598622] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e6176b18-a992-4c23-9d06-f9c3c1e06396 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 559.618590] env[68571]: DEBUG oslo_vmware.api [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Waiting for the task: (returnval){ [ 559.618590] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52e268b1-8d72-e254-7c7d-472d23a5c689" [ 559.618590] env[68571]: _type = "Task" [ 559.618590] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 559.623626] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 559.623626] env[68571]: value = "task-3467570" [ 559.623626] env[68571]: _type = "Task" [ 559.623626] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 559.631518] env[68571]: DEBUG oslo_concurrency.lockutils [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 559.631714] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 559.631940] env[68571]: DEBUG oslo_concurrency.lockutils [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 559.636597] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467570, 'name': CreateVM_Task} progress is 5%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 560.085620] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467567, 'name': CreateVM_Task, 'duration_secs': 0.391914} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 560.086343] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 560.089881] env[68571]: DEBUG oslo_concurrency.lockutils [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 560.089881] env[68571]: DEBUG oslo_concurrency.lockutils [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 560.089881] env[68571]: DEBUG oslo_concurrency.lockutils [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 560.089881] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7e14a0cb-fe86-4175-b293-a0cb3d5daeb1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 560.093744] env[68571]: DEBUG oslo_vmware.api [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Waiting for the task: (returnval){ [ 560.093744] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]5253912a-8bcb-ed60-3999-4b1a7aeb1b33" [ 560.093744] env[68571]: _type = "Task" [ 560.093744] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 560.102487] env[68571]: DEBUG oslo_vmware.api [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]5253912a-8bcb-ed60-3999-4b1a7aeb1b33, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 560.136887] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467570, 'name': CreateVM_Task, 'duration_secs': 0.424274} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 560.137528] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 560.138323] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 560.609099] env[68571]: DEBUG oslo_concurrency.lockutils [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 560.609391] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 560.610310] env[68571]: DEBUG oslo_concurrency.lockutils [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 560.610310] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 560.610310] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 560.610310] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c04e165c-e137-4f07-b108-c6b4b0bdefec {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 560.617000] env[68571]: DEBUG oslo_vmware.api [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Waiting for the task: (returnval){ [ 560.617000] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52a8cda2-521b-323e-d880-ae80032169cc" [ 560.617000] env[68571]: _type = "Task" [ 560.617000] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 560.629322] env[68571]: DEBUG oslo_vmware.api [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52a8cda2-521b-323e-d880-ae80032169cc, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 560.759193] env[68571]: DEBUG nova.compute.manager [req-1a1e06b3-2261-4e5d-8963-5e419b092fbe req-3d8786bb-af16-44eb-9b83-75e5d108feab service nova] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Received event network-vif-plugged-f4bc75b7-4935-4c48-97eb-f421f6fe8ba2 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 560.759193] env[68571]: DEBUG oslo_concurrency.lockutils [req-1a1e06b3-2261-4e5d-8963-5e419b092fbe req-3d8786bb-af16-44eb-9b83-75e5d108feab service nova] Acquiring lock "0eae5e9a-258a-44e5-9b4f-53100f15aa7a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 560.759193] env[68571]: DEBUG oslo_concurrency.lockutils [req-1a1e06b3-2261-4e5d-8963-5e419b092fbe req-3d8786bb-af16-44eb-9b83-75e5d108feab service nova] Lock "0eae5e9a-258a-44e5-9b4f-53100f15aa7a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 560.759193] env[68571]: DEBUG oslo_concurrency.lockutils [req-1a1e06b3-2261-4e5d-8963-5e419b092fbe req-3d8786bb-af16-44eb-9b83-75e5d108feab service nova] Lock "0eae5e9a-258a-44e5-9b4f-53100f15aa7a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 560.759457] env[68571]: DEBUG nova.compute.manager [req-1a1e06b3-2261-4e5d-8963-5e419b092fbe req-3d8786bb-af16-44eb-9b83-75e5d108feab service nova] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] No waiting events found dispatching network-vif-plugged-f4bc75b7-4935-4c48-97eb-f421f6fe8ba2 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 560.759457] env[68571]: WARNING nova.compute.manager [req-1a1e06b3-2261-4e5d-8963-5e419b092fbe req-3d8786bb-af16-44eb-9b83-75e5d108feab service nova] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Received unexpected event network-vif-plugged-f4bc75b7-4935-4c48-97eb-f421f6fe8ba2 for instance with vm_state building and task_state spawning. 
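The `network-vif-plugged` exchange above is Neutron calling Nova's `os-server-external-events` API; `InstanceEvents.pop_instance_event` then matches the event by (name, tag) against any registered waiter, and the WARNING fires because the event arrived while the instance was still `building`/`spawning`, before the driver had registered a wait — a benign race during spawn. A minimal sketch of that notification, assuming a placeholder Nova endpoint and token (the UUIDs are the instance and port from the log entries above):

```python
# Sketch: the Neutron -> Nova notification behind "Received event
# network-vif-plugged-f4bc75b7-..." above. The endpoint and token are
# placeholders, not values from this deployment.
import json
import urllib.request

NOVA_ENDPOINT = "http://controller:8774/v2.1"   # assumed endpoint
TOKEN = "<keystone-token>"                      # assumed auth token

payload = {"events": [{
    "name": "network-vif-plugged",
    "server_uuid": "0eae5e9a-258a-44e5-9b4f-53100f15aa7a",  # instance
    "tag": "f4bc75b7-4935-4c48-97eb-f421f6fe8ba2",          # Neutron port
    "status": "completed",
}]}

req = urllib.request.Request(
    NOVA_ENDPOINT + "/os-server-external-events",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json", "X-Auth-Token": TOKEN},
    method="POST",
)
# urllib.request.urlopen(req) would deliver the event; Nova then either
# wakes a waiter registered for (name, tag) or logs the WARNING above.
```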
[ 560.943796] env[68571]: DEBUG nova.network.neutron [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Successfully updated port: 1190541d-54f9-4a7c-9bc5-3b0b4251f5cb {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 560.966319] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Acquiring lock "refresh_cache-349dd3c9-5769-458c-b7fa-ef08ce7d6b5d" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 560.966911] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Acquired lock "refresh_cache-349dd3c9-5769-458c-b7fa-ef08ce7d6b5d" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 560.968459] env[68571]: DEBUG nova.network.neutron [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 561.058121] env[68571]: DEBUG nova.network.neutron [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Instance cache missing network info. 
{{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 561.133032] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 561.133032] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 561.133361] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 561.303150] env[68571]: DEBUG oslo_concurrency.lockutils [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Acquiring lock "15eb6744-4b26-4d7a-8639-cb3bd13e3726" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 561.303624] env[68571]: DEBUG oslo_concurrency.lockutils [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Lock "15eb6744-4b26-4d7a-8639-cb3bd13e3726" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 561.314689] env[68571]: DEBUG nova.compute.manager [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Starting instance... 
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 561.398057] env[68571]: DEBUG oslo_concurrency.lockutils [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 561.398658] env[68571]: DEBUG oslo_concurrency.lockutils [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 561.400850] env[68571]: INFO nova.compute.claims [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 561.446111] env[68571]: DEBUG nova.network.neutron [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Updating instance_info_cache with network_info: [{"id": "1190541d-54f9-4a7c-9bc5-3b0b4251f5cb", "address": "fa:16:3e:01:6f:89", "network": {"id": "f609005f-44c4-41e6-8e8f-4af16581eb6d", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-680917518-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "08c02345327245a99a5bb11408b51c6d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3c995e9-7f2f-420c-880a-d60da6e708ad", "external-id": "nsx-vlan-transportzone-166", "segmentation_id": 166, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1190541d-54", "ovs_interfaceid": "1190541d-54f9-4a7c-9bc5-3b0b4251f5cb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 561.475781] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Releasing lock "refresh_cache-349dd3c9-5769-458c-b7fa-ef08ce7d6b5d" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 561.477436] env[68571]: DEBUG nova.compute.manager [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Instance network_info: |[{"id": "1190541d-54f9-4a7c-9bc5-3b0b4251f5cb", "address": "fa:16:3e:01:6f:89", "network": {"id": 
"f609005f-44c4-41e6-8e8f-4af16581eb6d", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-680917518-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "08c02345327245a99a5bb11408b51c6d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3c995e9-7f2f-420c-880a-d60da6e708ad", "external-id": "nsx-vlan-transportzone-166", "segmentation_id": 166, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1190541d-54", "ovs_interfaceid": "1190541d-54f9-4a7c-9bc5-3b0b4251f5cb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 561.483142] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:01:6f:89', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f3c995e9-7f2f-420c-880a-d60da6e708ad', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1190541d-54f9-4a7c-9bc5-3b0b4251f5cb', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 561.495472] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Creating folder: Project (08c02345327245a99a5bb11408b51c6d). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 561.496060] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e4180aae-69bd-4f04-ab86-48cc22576ac9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 561.507705] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Created folder: Project (08c02345327245a99a5bb11408b51c6d) in parent group-v692787. [ 561.507865] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Creating folder: Instances. Parent ref: group-v692810. 
{{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 561.510951] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-546d63cc-9804-4890-a3df-727f52e37f5d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 561.521881] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Created folder: Instances in parent group-v692810. [ 561.522161] env[68571]: DEBUG oslo.service.loopingcall [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 561.522364] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 561.524184] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-aac1463e-189a-405f-a1e4-555288d2eed1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 561.551096] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 561.551096] env[68571]: value = "task-3467574" [ 561.551096] env[68571]: _type = "Task" [ 561.551096] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 561.561662] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467574, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 561.664896] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2a46c99-2eed-47ff-a3aa-13b0d6cd58f1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 561.674628] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-79413458-7328-42d9-aad5-43e937ed35a5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 561.705650] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c836152a-4280-4aa9-9e76-87d92d110e1e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 561.714376] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70c0208b-f951-4264-9861-29503a7995db {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 561.729275] env[68571]: DEBUG nova.compute.provider_tree [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 561.738961] env[68571]: DEBUG nova.scheduler.client.report [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 561.759812] env[68571]: DEBUG oslo_concurrency.lockutils [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.361s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 561.760588] env[68571]: DEBUG nova.compute.manager [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Start building networks asynchronously for instance. 
{{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 561.806414] env[68571]: DEBUG nova.compute.utils [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 561.807877] env[68571]: DEBUG nova.compute.manager [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 561.812025] env[68571]: DEBUG nova.network.neutron [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 561.825489] env[68571]: DEBUG nova.compute.manager [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 561.934482] env[68571]: DEBUG nova.compute.manager [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Start spawning the instance on the hypervisor. {{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 561.981173] env[68571]: DEBUG nova.virt.hardware [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 561.981173] env[68571]: DEBUG nova.virt.hardware [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 561.981332] env[68571]: DEBUG nova.virt.hardware [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 561.981716] 
env[68571]: DEBUG nova.virt.hardware [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 561.982048] env[68571]: DEBUG nova.virt.hardware [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 561.982349] env[68571]: DEBUG nova.virt.hardware [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 561.982674] env[68571]: DEBUG nova.virt.hardware [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 561.982957] env[68571]: DEBUG nova.virt.hardware [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 561.983484] env[68571]: DEBUG nova.virt.hardware [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 561.983802] env[68571]: DEBUG nova.virt.hardware [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 561.984391] env[68571]: DEBUG nova.virt.hardware [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 561.986024] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-abcbb85e-81bb-419d-b9bc-b6cff7618b35 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 561.992823] env[68571]: DEBUG nova.policy [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f487141eae96491480b072daf61453d2', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '08c02345327245a99a5bb11408b51c6d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': 
None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 562.000674] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab7ce349-2f46-498a-b72b-43c3f194ec0b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 562.062936] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467574, 'name': CreateVM_Task, 'duration_secs': 0.338294} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 562.065771] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 562.065771] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 562.065771] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 562.065771] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 562.065771] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7fdd356f-5cd3-43b0-8c3e-1c1a95ef20c7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 562.069786] env[68571]: DEBUG oslo_vmware.api [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Waiting for the task: (returnval){ [ 562.069786] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]5292b54b-91d8-d09f-12eb-adf59c6cd53c" [ 562.069786] env[68571]: _type = "Task" [ 562.069786] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 562.081960] env[68571]: DEBUG oslo_vmware.api [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]5292b54b-91d8-d09f-12eb-adf59c6cd53c, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 562.299469] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Acquiring lock "e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 562.299588] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Lock "e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 562.316453] env[68571]: DEBUG nova.compute.manager [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 562.382817] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 562.383088] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 562.386924] env[68571]: INFO nova.compute.claims [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 562.582111] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 562.582111] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 562.582322] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 
tempest-ServersAdminTestJSON-1356374649-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 562.589132] env[68571]: DEBUG nova.network.neutron [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Successfully created port: 3c41d773-b922-4c36-8868-03cd1cc0a534 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 562.653035] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-574cc35f-87d2-4832-92b3-f7dbe0714619 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 562.661939] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-490ae05e-47b1-4de4-aec4-4be339ef8387 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 562.697370] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-014d5700-99dc-4505-bff3-7c890f45d2d0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 562.705901] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f56e89e-261e-4a4a-94af-b0255b8b1c65 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 562.720510] env[68571]: DEBUG nova.compute.provider_tree [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 562.737346] env[68571]: DEBUG nova.scheduler.client.report [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 562.757626] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.374s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 562.757626] env[68571]: DEBUG nova.compute.manager [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: 
e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 562.807546] env[68571]: DEBUG nova.compute.utils [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 562.814927] env[68571]: DEBUG nova.compute.manager [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 562.815183] env[68571]: DEBUG nova.network.neutron [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 562.823100] env[68571]: DEBUG nova.compute.manager [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Received event network-changed-627e2aef-0903-47cd-8216-b5373437cb33 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 562.823361] env[68571]: DEBUG nova.compute.manager [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Refreshing instance network info cache due to event network-changed-627e2aef-0903-47cd-8216-b5373437cb33. {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 562.823765] env[68571]: DEBUG oslo_concurrency.lockutils [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] Acquiring lock "refresh_cache-c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 562.823765] env[68571]: DEBUG oslo_concurrency.lockutils [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] Acquired lock "refresh_cache-c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 562.824237] env[68571]: DEBUG nova.network.neutron [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Refreshing network info cache for port 627e2aef-0903-47cd-8216-b5373437cb33 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 562.854442] env[68571]: DEBUG nova.compute.manager [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Start building block device mappings for instance. 
{{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 562.983216] env[68571]: DEBUG nova.compute.manager [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Start spawning the instance on the hypervisor. {{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 563.022594] env[68571]: DEBUG nova.virt.hardware [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 563.022853] env[68571]: DEBUG nova.virt.hardware [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 563.023037] env[68571]: DEBUG nova.virt.hardware [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 563.023221] env[68571]: DEBUG nova.virt.hardware [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 563.023364] env[68571]: DEBUG nova.virt.hardware [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 563.023507] env[68571]: DEBUG nova.virt.hardware [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 563.023702] env[68571]: DEBUG nova.virt.hardware [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 
563.023856] env[68571]: DEBUG nova.virt.hardware [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 563.024208] env[68571]: DEBUG nova.virt.hardware [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 563.024478] env[68571]: DEBUG nova.virt.hardware [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 563.024680] env[68571]: DEBUG nova.virt.hardware [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 563.025935] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea3f6759-cfc7-4025-958f-763f0dc6c7bb {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 563.040018] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46130495-996a-4497-87c6-793919a3ed62 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 563.452702] env[68571]: DEBUG nova.policy [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a17aec0498f74f299cfda276df67edf8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8c1730599cca4abeb755b2710e854059', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 563.763723] env[68571]: DEBUG nova.network.neutron [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Updated VIF entry in instance network info cache for port 627e2aef-0903-47cd-8216-b5373437cb33. 
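The `Build topologies for 1 vcpu(s) 1:1:1`, `Got 1 possible topologies`, and `Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)]` progression above enumerates every sockets/cores/threads factorization of the vCPU count that fits the effective 65536:65536:65536 limits. A simplified re-implementation (not Nova's actual code) showing why a one-vCPU flavor admits exactly one topology:

```python
# Simplified stand-in for Nova's _get_possible_cpu_topologies: list every
# (sockets, cores, threads) triple whose product equals the vCPU count and
# which stays within the per-dimension limits.
def possible_topologies(vcpus, max_sockets, max_cores, max_threads):
    return [(s, c, t)
            for s in range(1, min(vcpus, max_sockets) + 1)
            for c in range(1, min(vcpus, max_cores) + 1)
            for t in range(1, min(vcpus, max_threads) + 1)
            if s * c * t == vcpus]

# The m1.nano case from the log: 1 vCPU under 65536:65536:65536 limits.
print(possible_topologies(1, 65536, 65536, 65536))  # [(1, 1, 1)]
```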
{{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 563.764121] env[68571]: DEBUG nova.network.neutron [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Updating instance_info_cache with network_info: [{"id": "627e2aef-0903-47cd-8216-b5373437cb33", "address": "fa:16:3e:a9:a9:93", "network": {"id": "802e91c0-b497-4996-a9a8-0fb2969a1fd5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.70", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "129da41d4b1a4202be57f86562f628cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap627e2aef-09", "ovs_interfaceid": "627e2aef-0903-47cd-8216-b5373437cb33", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 563.775198] env[68571]: DEBUG oslo_concurrency.lockutils [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] Releasing lock "refresh_cache-c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 563.775198] env[68571]: DEBUG nova.compute.manager [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Received event network-vif-plugged-5f023740-f8c1-4ca9-85c4-fae9770d09c2 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 563.775198] env[68571]: DEBUG oslo_concurrency.lockutils [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] Acquiring lock "f3b237f4-6e23-4474-b841-aa3ca8c1486f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 563.775397] env[68571]: DEBUG oslo_concurrency.lockutils [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] Lock "f3b237f4-6e23-4474-b841-aa3ca8c1486f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 563.775754] env[68571]: DEBUG oslo_concurrency.lockutils [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] Lock "f3b237f4-6e23-4474-b841-aa3ca8c1486f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 563.776245] env[68571]: DEBUG nova.compute.manager 
[req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] No waiting events found dispatching network-vif-plugged-5f023740-f8c1-4ca9-85c4-fae9770d09c2 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 563.776245] env[68571]: WARNING nova.compute.manager [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Received unexpected event network-vif-plugged-5f023740-f8c1-4ca9-85c4-fae9770d09c2 for instance with vm_state building and task_state spawning. [ 563.776245] env[68571]: DEBUG nova.compute.manager [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Received event network-changed-5f023740-f8c1-4ca9-85c4-fae9770d09c2 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 563.776390] env[68571]: DEBUG nova.compute.manager [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Refreshing instance network info cache due to event network-changed-5f023740-f8c1-4ca9-85c4-fae9770d09c2. {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 563.776640] env[68571]: DEBUG oslo_concurrency.lockutils [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] Acquiring lock "refresh_cache-f3b237f4-6e23-4474-b841-aa3ca8c1486f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 563.778735] env[68571]: DEBUG oslo_concurrency.lockutils [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] Acquired lock "refresh_cache-f3b237f4-6e23-4474-b841-aa3ca8c1486f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 563.778735] env[68571]: DEBUG nova.network.neutron [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Refreshing network info cache for port 5f023740-f8c1-4ca9-85c4-fae9770d09c2 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 564.385183] env[68571]: DEBUG nova.network.neutron [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Updated VIF entry in instance network info cache for port 5f023740-f8c1-4ca9-85c4-fae9770d09c2. 
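The `No waiting events found dispatching network-vif-plugged-...` and WARNING `Received unexpected event ...` entries above come from Nova's instance-event bookkeeping: an external event from Neutron is only consumed if the spawn path registered a waiter for it first; otherwise it is logged and dropped. A toy sketch of that pop-if-registered pattern (not Nova's implementation):

```python
# Toy version of the pop_instance_event idea behind the log lines above:
# an event is delivered only to a previously registered waiter.
import threading

waiters = {}  # (instance_uuid, event_name) -> threading.Event

def prepare_for_event(instance_uuid, event_name):
    waiters[(instance_uuid, event_name)] = threading.Event()

def dispatch(instance_uuid, event_name):
    waiter = waiters.pop((instance_uuid, event_name), None)
    if waiter is None:
        # -> the WARNING "Received unexpected event ..." path
        print(f'unexpected event {event_name} for {instance_uuid}')
    else:
        waiter.set()  # unblocks the code waiting for the VIF plug

dispatch('demo-uuid', 'network-vif-plugged')           # no waiter yet
prepare_for_event('demo-uuid', 'network-vif-plugged')
dispatch('demo-uuid', 'network-vif-plugged')           # consumed
```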
{{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 564.385537] env[68571]: DEBUG nova.network.neutron [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Updating instance_info_cache with network_info: [{"id": "5f023740-f8c1-4ca9-85c4-fae9770d09c2", "address": "fa:16:3e:5d:9f:8e", "network": {"id": "802e91c0-b497-4996-a9a8-0fb2969a1fd5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.138", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "129da41d4b1a4202be57f86562f628cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5f023740-f8", "ovs_interfaceid": "5f023740-f8c1-4ca9-85c4-fae9770d09c2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 564.403402] env[68571]: DEBUG oslo_concurrency.lockutils [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] Releasing lock "refresh_cache-f3b237f4-6e23-4474-b841-aa3ca8c1486f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 564.404835] env[68571]: DEBUG nova.compute.manager [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Received event network-vif-plugged-b622f4e9-7794-4a33-9e4c-2b57102c8116 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 564.404835] env[68571]: DEBUG oslo_concurrency.lockutils [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] Acquiring lock "e49a885d-c0d2-414b-b1f0-bfc3a710e9ad-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 564.404835] env[68571]: DEBUG oslo_concurrency.lockutils [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] Lock "e49a885d-c0d2-414b-b1f0-bfc3a710e9ad-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 564.404835] env[68571]: DEBUG oslo_concurrency.lockutils [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] Lock "e49a885d-c0d2-414b-b1f0-bfc3a710e9ad-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 564.405176] env[68571]: DEBUG nova.compute.manager 
[req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] No waiting events found dispatching network-vif-plugged-b622f4e9-7794-4a33-9e4c-2b57102c8116 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 564.405176] env[68571]: WARNING nova.compute.manager [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Received unexpected event network-vif-plugged-b622f4e9-7794-4a33-9e4c-2b57102c8116 for instance with vm_state building and task_state spawning. [ 564.405176] env[68571]: DEBUG nova.compute.manager [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Received event network-changed-b622f4e9-7794-4a33-9e4c-2b57102c8116 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 564.405176] env[68571]: DEBUG nova.compute.manager [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Refreshing instance network info cache due to event network-changed-b622f4e9-7794-4a33-9e4c-2b57102c8116. {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 564.405394] env[68571]: DEBUG oslo_concurrency.lockutils [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] Acquiring lock "refresh_cache-e49a885d-c0d2-414b-b1f0-bfc3a710e9ad" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 564.405483] env[68571]: DEBUG oslo_concurrency.lockutils [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] Acquired lock "refresh_cache-e49a885d-c0d2-414b-b1f0-bfc3a710e9ad" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 564.405650] env[68571]: DEBUG nova.network.neutron [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Refreshing network info cache for port b622f4e9-7794-4a33-9e4c-2b57102c8116 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 564.741479] env[68571]: DEBUG nova.network.neutron [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Successfully updated port: 3c41d773-b922-4c36-8868-03cd1cc0a534 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 564.756738] env[68571]: DEBUG oslo_concurrency.lockutils [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Acquiring lock "refresh_cache-15eb6744-4b26-4d7a-8639-cb3bd13e3726" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 564.758250] env[68571]: DEBUG oslo_concurrency.lockutils [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Acquired lock "refresh_cache-15eb6744-4b26-4d7a-8639-cb3bd13e3726" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 
564.758250] env[68571]: DEBUG nova.network.neutron [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 564.937291] env[68571]: DEBUG nova.network.neutron [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Successfully created port: 35462b23-1d6d-4333-ab0e-e7aa78b8bb6b {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 565.000558] env[68571]: DEBUG nova.network.neutron [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 565.150868] env[68571]: DEBUG nova.network.neutron [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Updated VIF entry in instance network info cache for port b622f4e9-7794-4a33-9e4c-2b57102c8116. {{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 565.150868] env[68571]: DEBUG nova.network.neutron [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Updating instance_info_cache with network_info: [{"id": "b622f4e9-7794-4a33-9e4c-2b57102c8116", "address": "fa:16:3e:f7:87:2c", "network": {"id": "802e91c0-b497-4996-a9a8-0fb2969a1fd5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.194", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "129da41d4b1a4202be57f86562f628cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb622f4e9-77", "ovs_interfaceid": "b622f4e9-7794-4a33-9e4c-2b57102c8116", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 565.164593] env[68571]: DEBUG oslo_concurrency.lockutils [req-920d171f-5538-4ab9-9218-d09165b11b5e req-e2685402-9cb0-44cb-8645-abd64739b9e2 service nova] Releasing lock "refresh_cache-e49a885d-c0d2-414b-b1f0-bfc3a710e9ad" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 565.355826] env[68571]: DEBUG nova.network.neutron [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 
15eb6744-4b26-4d7a-8639-cb3bd13e3726] Updating instance_info_cache with network_info: [{"id": "3c41d773-b922-4c36-8868-03cd1cc0a534", "address": "fa:16:3e:03:ee:b1", "network": {"id": "f609005f-44c4-41e6-8e8f-4af16581eb6d", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-680917518-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "08c02345327245a99a5bb11408b51c6d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3c995e9-7f2f-420c-880a-d60da6e708ad", "external-id": "nsx-vlan-transportzone-166", "segmentation_id": 166, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3c41d773-b9", "ovs_interfaceid": "3c41d773-b922-4c36-8868-03cd1cc0a534", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 565.375782] env[68571]: DEBUG oslo_concurrency.lockutils [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Releasing lock "refresh_cache-15eb6744-4b26-4d7a-8639-cb3bd13e3726" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 565.376358] env[68571]: DEBUG nova.compute.manager [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Instance network_info: |[{"id": "3c41d773-b922-4c36-8868-03cd1cc0a534", "address": "fa:16:3e:03:ee:b1", "network": {"id": "f609005f-44c4-41e6-8e8f-4af16581eb6d", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-680917518-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "08c02345327245a99a5bb11408b51c6d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3c995e9-7f2f-420c-880a-d60da6e708ad", "external-id": "nsx-vlan-transportzone-166", "segmentation_id": 166, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3c41d773-b9", "ovs_interfaceid": "3c41d773-b922-4c36-8868-03cd1cc0a534", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 565.378100] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Instance VIF info [{'network_name': 
'br-int', 'mac_address': 'fa:16:3e:03:ee:b1', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f3c995e9-7f2f-420c-880a-d60da6e708ad', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3c41d773-b922-4c36-8868-03cd1cc0a534', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 565.390236] env[68571]: DEBUG oslo.service.loopingcall [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 565.391734] env[68571]: DEBUG nova.compute.manager [req-96f76c56-0058-413c-a0c6-9a9ed56a909a req-d87475ac-bf17-4ee9-9a09-bdc84addaad9 service nova] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Received event network-changed-f4bc75b7-4935-4c48-97eb-f421f6fe8ba2 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 565.391932] env[68571]: DEBUG nova.compute.manager [req-96f76c56-0058-413c-a0c6-9a9ed56a909a req-d87475ac-bf17-4ee9-9a09-bdc84addaad9 service nova] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Refreshing instance network info cache due to event network-changed-f4bc75b7-4935-4c48-97eb-f421f6fe8ba2. {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 565.392311] env[68571]: DEBUG oslo_concurrency.lockutils [req-96f76c56-0058-413c-a0c6-9a9ed56a909a req-d87475ac-bf17-4ee9-9a09-bdc84addaad9 service nova] Acquiring lock "refresh_cache-0eae5e9a-258a-44e5-9b4f-53100f15aa7a" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 565.392459] env[68571]: DEBUG oslo_concurrency.lockutils [req-96f76c56-0058-413c-a0c6-9a9ed56a909a req-d87475ac-bf17-4ee9-9a09-bdc84addaad9 service nova] Acquired lock "refresh_cache-0eae5e9a-258a-44e5-9b4f-53100f15aa7a" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 565.392619] env[68571]: DEBUG nova.network.neutron [req-96f76c56-0058-413c-a0c6-9a9ed56a909a req-d87475ac-bf17-4ee9-9a09-bdc84addaad9 service nova] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Refreshing network info cache for port f4bc75b7-4935-4c48-97eb-f421f6fe8ba2 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 565.395622] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 565.396078] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8b7ebae8-1095-4f2b-aa78-ffa9b077042f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 565.424138] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 565.424138] env[68571]: value = "task-3467577" [ 565.424138] env[68571]: _type = "Task" [ 565.424138] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 565.432040] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467577, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 565.655085] env[68571]: DEBUG oslo_concurrency.lockutils [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 565.655381] env[68571]: DEBUG oslo_concurrency.lockutils [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 565.678674] env[68571]: DEBUG nova.compute.manager [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 565.745517] env[68571]: DEBUG oslo_concurrency.lockutils [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 565.745517] env[68571]: DEBUG oslo_concurrency.lockutils [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 565.748575] env[68571]: INFO nova.compute.claims [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 566.477751] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467577, 'name': CreateVM_Task, 'duration_secs': 0.352481} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 566.481183] env[68571]: DEBUG nova.network.neutron [req-96f76c56-0058-413c-a0c6-9a9ed56a909a req-d87475ac-bf17-4ee9-9a09-bdc84addaad9 service nova] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Updated VIF entry in instance network info cache for port f4bc75b7-4935-4c48-97eb-f421f6fe8ba2. 
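The task-3467577 lifecycle above (`Waiting for the task`, `progress is 0%`, then `completed successfully ... duration_secs: 0.352481`) is oslo.vmware's polling loop: each `Invoking <Object>.<Method>` line is one `invoke_api()` SOAP call, and methods ending in `_Task` return a Task managed object that `wait_for_task()` polls to completion. A sketch of that cycle; the vCenter endpoint, credentials, and the resource-pool moref are placeholders, and a real VirtualMachineConfigSpec needs far more fields than shown:

```python
# Sketch of the oslo.vmware invoke/poll cycle behind the entries above.
# Endpoint, credentials, and the pool moref are placeholders, and the
# config spec is deliberately minimal.
from oslo_vmware import api, vim_util

session = api.VMwareAPISession('vc.example.test', 'user', 'secret',
                               api_retry_count=3, task_poll_interval=0.5)

folder = vim_util.get_moref('group-v692814', 'Folder')   # ref from the log
pool = vim_util.get_moref('resgroup-1', 'ResourcePool')  # placeholder

spec = session.vim.client.factory.create('ns0:VirtualMachineConfigSpec')
spec.name = 'demo-vm'

# Each 'Invoking Folder.CreateVM_Task with opID=...' line is one call;
# wait_for_task() then emits the 'Waiting for the task' / 'progress is N%'
# entries while polling, and raises if the task ends in error.
task = session.invoke_api(session.vim, 'CreateVM_Task', folder,
                          config=spec, pool=pool)
task_info = session.wait_for_task(task)
print(task_info.state)
```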
{{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 566.481498] env[68571]: DEBUG nova.network.neutron [req-96f76c56-0058-413c-a0c6-9a9ed56a909a req-d87475ac-bf17-4ee9-9a09-bdc84addaad9 service nova] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Updating instance_info_cache with network_info: [{"id": "f4bc75b7-4935-4c48-97eb-f421f6fe8ba2", "address": "fa:16:3e:67:b6:1d", "network": {"id": "802e91c0-b497-4996-a9a8-0fb2969a1fd5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.108", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "129da41d4b1a4202be57f86562f628cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf4bc75b7-49", "ovs_interfaceid": "f4bc75b7-4935-4c48-97eb-f421f6fe8ba2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 566.482543] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 566.483538] env[68571]: DEBUG oslo_concurrency.lockutils [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 566.483870] env[68571]: DEBUG oslo_concurrency.lockutils [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 566.484037] env[68571]: DEBUG oslo_concurrency.lockutils [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 566.484293] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-81aacc99-8f29-4cca-8588-cf7593f2603e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.492760] env[68571]: DEBUG oslo_vmware.api [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Waiting for the task: (returnval){ [ 566.492760] env[68571]: value = 
"session[52d81342-85e4-ea29-2389-62ee1f7826ca]526236a9-3562-ae29-5e74-1820d4f2d76c" [ 566.492760] env[68571]: _type = "Task" [ 566.492760] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 566.497792] env[68571]: DEBUG oslo_concurrency.lockutils [req-96f76c56-0058-413c-a0c6-9a9ed56a909a req-d87475ac-bf17-4ee9-9a09-bdc84addaad9 service nova] Releasing lock "refresh_cache-0eae5e9a-258a-44e5-9b4f-53100f15aa7a" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 566.501829] env[68571]: DEBUG oslo_vmware.api [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]526236a9-3562-ae29-5e74-1820d4f2d76c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 566.554344] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f36be412-40e4-43f9-87ec-c09ec78ab457 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.562373] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a41b66d-c62b-4b27-80da-c12d023bd3b0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.595097] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ca523e1c-dd05-4cd1-a0b6-4e919673d3a5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.603763] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-206abec3-bbf9-4d34-b110-9977a3f69f9d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.617407] env[68571]: DEBUG nova.compute.provider_tree [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 566.627697] env[68571]: DEBUG nova.scheduler.client.report [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 566.649044] env[68571]: DEBUG oslo_concurrency.lockutils [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "compute_resources" "released" by 
"nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.904s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 566.649557] env[68571]: DEBUG nova.compute.manager [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 566.701038] env[68571]: DEBUG nova.compute.utils [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 566.702129] env[68571]: DEBUG nova.compute.manager [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 566.702318] env[68571]: DEBUG nova.network.neutron [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 566.723355] env[68571]: DEBUG nova.compute.manager [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 566.784848] env[68571]: DEBUG nova.policy [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd34e5361b36c4dc5824b0f42a37e6bb8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '290427ab03f446ce9297ea393c083ff9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 566.812360] env[68571]: DEBUG nova.compute.manager [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 566.856097] env[68571]: DEBUG nova.virt.hardware [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 566.856490] env[68571]: DEBUG nova.virt.hardware [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 566.856490] env[68571]: DEBUG nova.virt.hardware [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 566.858328] env[68571]: DEBUG nova.virt.hardware [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 566.858328] env[68571]: DEBUG nova.virt.hardware [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 566.858328] env[68571]: DEBUG nova.virt.hardware [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 566.858328] env[68571]: DEBUG nova.virt.hardware [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 566.858707] env[68571]: DEBUG nova.virt.hardware [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 566.858707] env[68571]: DEBUG 
nova.virt.hardware [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 566.858801] env[68571]: DEBUG nova.virt.hardware [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 566.859845] env[68571]: DEBUG nova.virt.hardware [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 566.861160] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-559fdb0b-4764-46d2-b026-cd27b5da3eec {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 566.871732] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00f42b42-6c40-41b3-8676-ad8cb43acf1e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 567.003695] env[68571]: DEBUG oslo_concurrency.lockutils [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 567.003902] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 567.004138] env[68571]: DEBUG oslo_concurrency.lockutils [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 567.418751] env[68571]: DEBUG nova.compute.manager [req-d8a5f77d-6eb2-48a9-9429-ac798278547e req-a10123ee-63d3-4c7f-97af-df03ef183409 service nova] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Received event network-vif-plugged-3c41d773-b922-4c36-8868-03cd1cc0a534 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 567.418979] env[68571]: DEBUG oslo_concurrency.lockutils [req-d8a5f77d-6eb2-48a9-9429-ac798278547e req-a10123ee-63d3-4c7f-97af-df03ef183409 service nova] Acquiring lock "15eb6744-4b26-4d7a-8639-cb3bd13e3726-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 567.419190] env[68571]: 
DEBUG oslo_concurrency.lockutils [req-d8a5f77d-6eb2-48a9-9429-ac798278547e req-a10123ee-63d3-4c7f-97af-df03ef183409 service nova] Lock "15eb6744-4b26-4d7a-8639-cb3bd13e3726-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 567.419355] env[68571]: DEBUG oslo_concurrency.lockutils [req-d8a5f77d-6eb2-48a9-9429-ac798278547e req-a10123ee-63d3-4c7f-97af-df03ef183409 service nova] Lock "15eb6744-4b26-4d7a-8639-cb3bd13e3726-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 567.419516] env[68571]: DEBUG nova.compute.manager [req-d8a5f77d-6eb2-48a9-9429-ac798278547e req-a10123ee-63d3-4c7f-97af-df03ef183409 service nova] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] No waiting events found dispatching network-vif-plugged-3c41d773-b922-4c36-8868-03cd1cc0a534 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 567.419664] env[68571]: WARNING nova.compute.manager [req-d8a5f77d-6eb2-48a9-9429-ac798278547e req-a10123ee-63d3-4c7f-97af-df03ef183409 service nova] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Received unexpected event network-vif-plugged-3c41d773-b922-4c36-8868-03cd1cc0a534 for instance with vm_state building and task_state spawning. [ 567.543115] env[68571]: DEBUG nova.network.neutron [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Successfully created port: a20b39ba-613f-43dc-ae91-a19d2488eef7 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 567.562487] env[68571]: DEBUG nova.network.neutron [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Successfully updated port: 35462b23-1d6d-4333-ab0e-e7aa78b8bb6b {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 567.578652] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Acquiring lock "refresh_cache-e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 567.578997] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Acquired lock "refresh_cache-e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 567.578997] env[68571]: DEBUG nova.network.neutron [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 567.670208] env[68571]: DEBUG nova.network.neutron [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e 
tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 568.071285] env[68571]: DEBUG nova.compute.manager [req-9d3dd710-46ce-4917-8830-788418d7adb4 req-03007bda-6cab-4a69-b682-a65996eeb9b9 service nova] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Received event network-vif-plugged-1190541d-54f9-4a7c-9bc5-3b0b4251f5cb {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 568.071633] env[68571]: DEBUG oslo_concurrency.lockutils [req-9d3dd710-46ce-4917-8830-788418d7adb4 req-03007bda-6cab-4a69-b682-a65996eeb9b9 service nova] Acquiring lock "349dd3c9-5769-458c-b7fa-ef08ce7d6b5d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 568.071943] env[68571]: DEBUG oslo_concurrency.lockutils [req-9d3dd710-46ce-4917-8830-788418d7adb4 req-03007bda-6cab-4a69-b682-a65996eeb9b9 service nova] Lock "349dd3c9-5769-458c-b7fa-ef08ce7d6b5d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 568.072222] env[68571]: DEBUG oslo_concurrency.lockutils [req-9d3dd710-46ce-4917-8830-788418d7adb4 req-03007bda-6cab-4a69-b682-a65996eeb9b9 service nova] Lock "349dd3c9-5769-458c-b7fa-ef08ce7d6b5d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 568.072481] env[68571]: DEBUG nova.compute.manager [req-9d3dd710-46ce-4917-8830-788418d7adb4 req-03007bda-6cab-4a69-b682-a65996eeb9b9 service nova] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] No waiting events found dispatching network-vif-plugged-1190541d-54f9-4a7c-9bc5-3b0b4251f5cb {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 568.072768] env[68571]: WARNING nova.compute.manager [req-9d3dd710-46ce-4917-8830-788418d7adb4 req-03007bda-6cab-4a69-b682-a65996eeb9b9 service nova] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Received unexpected event network-vif-plugged-1190541d-54f9-4a7c-9bc5-3b0b4251f5cb for instance with vm_state building and task_state spawning. [ 568.074068] env[68571]: DEBUG nova.compute.manager [req-9d3dd710-46ce-4917-8830-788418d7adb4 req-03007bda-6cab-4a69-b682-a65996eeb9b9 service nova] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Received event network-changed-1190541d-54f9-4a7c-9bc5-3b0b4251f5cb {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 568.074068] env[68571]: DEBUG nova.compute.manager [req-9d3dd710-46ce-4917-8830-788418d7adb4 req-03007bda-6cab-4a69-b682-a65996eeb9b9 service nova] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Refreshing instance network info cache due to event network-changed-1190541d-54f9-4a7c-9bc5-3b0b4251f5cb. 
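Entries like `Successfully created port: a20b39ba-...` and `Successfully updated port: 35462b23-...` above are Nova's Neutron wrapper at work: `_create_port_minimal` creates the port, and the later update binds it to the chosen host. Outside Nova, the create step looks roughly like this with openstacksdk; the cloud name is a placeholder clouds.yaml entry, and the network UUID is the shared network from the log:

```python
# Rough standalone equivalent of Nova's port-creation step, using
# openstacksdk; 'devstack' is a placeholder cloud name and the network
# UUID is the shared network seen in the log above.
import openstack

conn = openstack.connect(cloud='devstack')

port = conn.network.create_port(
    network_id='802e91c0-b497-4996-a9a8-0fb2969a1fd5',
    name='demo-port')
print(port.id, port.mac_address, port.fixed_ips)

# Nova's follow-up "Successfully updated port" is the host-binding update,
# which requires admin credentials and is omitted here.
conn.network.delete_port(port)  # clean up the demo port
```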
{{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 568.074068] env[68571]: DEBUG oslo_concurrency.lockutils [req-9d3dd710-46ce-4917-8830-788418d7adb4 req-03007bda-6cab-4a69-b682-a65996eeb9b9 service nova] Acquiring lock "refresh_cache-349dd3c9-5769-458c-b7fa-ef08ce7d6b5d" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 568.074068] env[68571]: DEBUG oslo_concurrency.lockutils [req-9d3dd710-46ce-4917-8830-788418d7adb4 req-03007bda-6cab-4a69-b682-a65996eeb9b9 service nova] Acquired lock "refresh_cache-349dd3c9-5769-458c-b7fa-ef08ce7d6b5d" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 568.081166] env[68571]: DEBUG nova.network.neutron [req-9d3dd710-46ce-4917-8830-788418d7adb4 req-03007bda-6cab-4a69-b682-a65996eeb9b9 service nova] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Refreshing network info cache for port 1190541d-54f9-4a7c-9bc5-3b0b4251f5cb {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 568.204245] env[68571]: DEBUG nova.network.neutron [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Updating instance_info_cache with network_info: [{"id": "35462b23-1d6d-4333-ab0e-e7aa78b8bb6b", "address": "fa:16:3e:e6:63:40", "network": {"id": "2a8924a7-6ad6-42e0-8092-758533aa8e53", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-498562246-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8c1730599cca4abeb755b2710e854059", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c24464bb-bb6b-43a2-bdcd-8086ad1a307f", "external-id": "nsx-vlan-transportzone-781", "segmentation_id": 781, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap35462b23-1d", "ovs_interfaceid": "35462b23-1d6d-4333-ab0e-e7aa78b8bb6b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 568.216894] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Releasing lock "refresh_cache-e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 568.217282] env[68571]: DEBUG nova.compute.manager [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Instance network_info: |[{"id": "35462b23-1d6d-4333-ab0e-e7aa78b8bb6b", "address": "fa:16:3e:e6:63:40", "network": {"id": "2a8924a7-6ad6-42e0-8092-758533aa8e53", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-498562246-network", 
"subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8c1730599cca4abeb755b2710e854059", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c24464bb-bb6b-43a2-bdcd-8086ad1a307f", "external-id": "nsx-vlan-transportzone-781", "segmentation_id": 781, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap35462b23-1d", "ovs_interfaceid": "35462b23-1d6d-4333-ab0e-e7aa78b8bb6b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 568.219404] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e6:63:40', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c24464bb-bb6b-43a2-bdcd-8086ad1a307f', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '35462b23-1d6d-4333-ab0e-e7aa78b8bb6b', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 568.230451] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Creating folder: Project (8c1730599cca4abeb755b2710e854059). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 568.231243] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-11719035-6e26-4687-8c57-7fadf1f64073 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 568.244273] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Created folder: Project (8c1730599cca4abeb755b2710e854059) in parent group-v692787. [ 568.244422] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Creating folder: Instances. Parent ref: group-v692814. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 568.244941] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b52bd82b-3ab1-4119-9142-1180298f2615 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 568.257013] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Created folder: Instances in parent group-v692814. 
[ 568.257193] env[68571]: DEBUG oslo.service.loopingcall [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 568.257375] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 568.257595] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0edf562b-71ab-4616-b78d-bc1f05263e6d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 568.285432] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 568.285432] env[68571]: value = "task-3467581" [ 568.285432] env[68571]: _type = "Task" [ 568.285432] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 568.293914] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467581, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 568.797509] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467581, 'name': CreateVM_Task, 'duration_secs': 0.365833} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 568.797715] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 568.798407] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 568.798576] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 568.798959] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 568.799157] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7e9710ef-20fd-45b4-a97e-5a6bb88623b8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 568.806326] env[68571]: DEBUG oslo_vmware.api [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] 
Waiting for the task: (returnval){ [ 568.806326] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]5285af2c-443b-8d5a-c22b-009e2227e4fe" [ 568.806326] env[68571]: _type = "Task" [ 568.806326] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 568.816400] env[68571]: DEBUG oslo_vmware.api [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]5285af2c-443b-8d5a-c22b-009e2227e4fe, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 568.829177] env[68571]: DEBUG nova.network.neutron [req-9d3dd710-46ce-4917-8830-788418d7adb4 req-03007bda-6cab-4a69-b682-a65996eeb9b9 service nova] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Updated VIF entry in instance network info cache for port 1190541d-54f9-4a7c-9bc5-3b0b4251f5cb. {{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 568.829539] env[68571]: DEBUG nova.network.neutron [req-9d3dd710-46ce-4917-8830-788418d7adb4 req-03007bda-6cab-4a69-b682-a65996eeb9b9 service nova] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Updating instance_info_cache with network_info: [{"id": "1190541d-54f9-4a7c-9bc5-3b0b4251f5cb", "address": "fa:16:3e:01:6f:89", "network": {"id": "f609005f-44c4-41e6-8e8f-4af16581eb6d", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-680917518-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "08c02345327245a99a5bb11408b51c6d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3c995e9-7f2f-420c-880a-d60da6e708ad", "external-id": "nsx-vlan-transportzone-166", "segmentation_id": 166, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1190541d-54", "ovs_interfaceid": "1190541d-54f9-4a7c-9bc5-3b0b4251f5cb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 568.845958] env[68571]: DEBUG oslo_concurrency.lockutils [req-9d3dd710-46ce-4917-8830-788418d7adb4 req-03007bda-6cab-4a69-b682-a65996eeb9b9 service nova] Releasing lock "refresh_cache-349dd3c9-5769-458c-b7fa-ef08ce7d6b5d" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 568.846572] env[68571]: DEBUG nova.compute.manager [req-9d3dd710-46ce-4917-8830-788418d7adb4 req-03007bda-6cab-4a69-b682-a65996eeb9b9 service nova] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Received event network-changed-3c41d773-b922-4c36-8868-03cd1cc0a534 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 568.846572] env[68571]: DEBUG nova.compute.manager [req-9d3dd710-46ce-4917-8830-788418d7adb4 req-03007bda-6cab-4a69-b682-a65996eeb9b9 service nova] [instance: 
15eb6744-4b26-4d7a-8639-cb3bd13e3726] Refreshing instance network info cache due to event network-changed-3c41d773-b922-4c36-8868-03cd1cc0a534. {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 568.846716] env[68571]: DEBUG oslo_concurrency.lockutils [req-9d3dd710-46ce-4917-8830-788418d7adb4 req-03007bda-6cab-4a69-b682-a65996eeb9b9 service nova] Acquiring lock "refresh_cache-15eb6744-4b26-4d7a-8639-cb3bd13e3726" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 568.846881] env[68571]: DEBUG oslo_concurrency.lockutils [req-9d3dd710-46ce-4917-8830-788418d7adb4 req-03007bda-6cab-4a69-b682-a65996eeb9b9 service nova] Acquired lock "refresh_cache-15eb6744-4b26-4d7a-8639-cb3bd13e3726" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 568.847044] env[68571]: DEBUG nova.network.neutron [req-9d3dd710-46ce-4917-8830-788418d7adb4 req-03007bda-6cab-4a69-b682-a65996eeb9b9 service nova] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Refreshing network info cache for port 3c41d773-b922-4c36-8868-03cd1cc0a534 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 569.320930] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 569.320930] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 569.320930] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 569.470671] env[68571]: DEBUG nova.compute.manager [req-cd2d9904-1051-49cc-904b-1c5216d8b90f req-b33aab62-fb86-4574-9efc-393c2625065a service nova] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Received event network-vif-plugged-35462b23-1d6d-4333-ab0e-e7aa78b8bb6b {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 569.471085] env[68571]: DEBUG oslo_concurrency.lockutils [req-cd2d9904-1051-49cc-904b-1c5216d8b90f req-b33aab62-fb86-4574-9efc-393c2625065a service nova] Acquiring lock "e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 569.471425] env[68571]: DEBUG oslo_concurrency.lockutils [req-cd2d9904-1051-49cc-904b-1c5216d8b90f req-b33aab62-fb86-4574-9efc-393c2625065a service nova] Lock "e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s 
{{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 569.471425] env[68571]: DEBUG oslo_concurrency.lockutils [req-cd2d9904-1051-49cc-904b-1c5216d8b90f req-b33aab62-fb86-4574-9efc-393c2625065a service nova] Lock "e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 569.471582] env[68571]: DEBUG nova.compute.manager [req-cd2d9904-1051-49cc-904b-1c5216d8b90f req-b33aab62-fb86-4574-9efc-393c2625065a service nova] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] No waiting events found dispatching network-vif-plugged-35462b23-1d6d-4333-ab0e-e7aa78b8bb6b {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 569.472264] env[68571]: WARNING nova.compute.manager [req-cd2d9904-1051-49cc-904b-1c5216d8b90f req-b33aab62-fb86-4574-9efc-393c2625065a service nova] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Received unexpected event network-vif-plugged-35462b23-1d6d-4333-ab0e-e7aa78b8bb6b for instance with vm_state building and task_state spawning. [ 569.501160] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 569.501496] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 569.501681] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 569.501801] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 569.527112] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 569.527273] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 569.527400] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 569.527521] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Skipping network cache update for instance because it is Building. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 569.527637] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 569.527752] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 569.527868] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 569.527980] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 569.528103] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 569.528216] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 569.528334] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 569.528843] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 569.529098] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 569.529300] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 569.529483] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 569.529674] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 569.529857] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 569.531054] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 569.531500] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 569.548400] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 569.548618] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 569.548781] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 569.548933] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 569.550117] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97795bc0-bd1d-4489-8904-e6f480810fdd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 569.565047] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4afac4dc-7611-4d34-87d6-2c6e728f93da {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 569.588502] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-108729a9-6380-40f2-9501-fedc9bd886f6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 569.595744] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2fa4252f-cd1f-43f3-9272-dd304e677ff6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 569.630902] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180941MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 569.631190] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 569.631387] 
env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 569.654326] env[68571]: DEBUG nova.network.neutron [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Successfully updated port: a20b39ba-613f-43dc-ae91-a19d2488eef7 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 569.673620] env[68571]: DEBUG oslo_concurrency.lockutils [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "refresh_cache-cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 569.673780] env[68571]: DEBUG oslo_concurrency.lockutils [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquired lock "refresh_cache-cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 569.673926] env[68571]: DEBUG nova.network.neutron [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 569.724814] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 0eae5e9a-258a-44e5-9b4f-53100f15aa7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 569.725336] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 569.725336] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f3b237f4-6e23-4474-b841-aa3ca8c1486f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 569.725336] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance e49a885d-c0d2-414b-b1f0-bfc3a710e9ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 569.725470] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4c5c97bc-4a9f-413b-a75f-a197270103a2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 569.725470] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 569.725718] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 8c30562a-4a81-4007-923c-3bc0b922f01c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 569.725799] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 15eb6744-4b26-4d7a-8639-cb3bd13e3726 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 569.725968] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 569.726111] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 569.726626] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 569.726626] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 569.752979] env[68571]: DEBUG nova.network.neutron [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Instance cache missing network info. 
{{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 569.758449] env[68571]: DEBUG nova.network.neutron [req-9d3dd710-46ce-4917-8830-788418d7adb4 req-03007bda-6cab-4a69-b682-a65996eeb9b9 service nova] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Updated VIF entry in instance network info cache for port 3c41d773-b922-4c36-8868-03cd1cc0a534. {{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 569.759454] env[68571]: DEBUG nova.network.neutron [req-9d3dd710-46ce-4917-8830-788418d7adb4 req-03007bda-6cab-4a69-b682-a65996eeb9b9 service nova] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Updating instance_info_cache with network_info: [{"id": "3c41d773-b922-4c36-8868-03cd1cc0a534", "address": "fa:16:3e:03:ee:b1", "network": {"id": "f609005f-44c4-41e6-8e8f-4af16581eb6d", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-680917518-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "08c02345327245a99a5bb11408b51c6d", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3c995e9-7f2f-420c-880a-d60da6e708ad", "external-id": "nsx-vlan-transportzone-166", "segmentation_id": 166, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3c41d773-b9", "ovs_interfaceid": "3c41d773-b922-4c36-8868-03cd1cc0a534", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 569.775187] env[68571]: DEBUG oslo_concurrency.lockutils [req-9d3dd710-46ce-4917-8830-788418d7adb4 req-03007bda-6cab-4a69-b682-a65996eeb9b9 service nova] Releasing lock "refresh_cache-15eb6744-4b26-4d7a-8639-cb3bd13e3726" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 569.924498] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e3a32ab-270d-4f46-9fc1-72cce9c28d77 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 569.931995] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0752f99a-b774-459a-a01c-5564ebaee4a3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 569.963635] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fd55dd0-4a46-4ed7-8dac-9ed133ffdd04 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 569.973581] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e5b9b8b-45a0-4d27-a07a-a6bfa75cf0ff {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 569.988214] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in 
ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 569.998749] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 570.018053] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 570.018313] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.387s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 570.118586] env[68571]: DEBUG nova.network.neutron [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Updating instance_info_cache with network_info: [{"id": "a20b39ba-613f-43dc-ae91-a19d2488eef7", "address": "fa:16:3e:31:56:37", "network": {"id": "653e8d49-b7ab-4d09-aa68-b76012e5b38e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-503364041-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "290427ab03f446ce9297ea393c083ff9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa20b39ba-61", "ovs_interfaceid": "a20b39ba-613f-43dc-ae91-a19d2488eef7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 570.137269] env[68571]: DEBUG oslo_concurrency.lockutils [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Releasing lock "refresh_cache-cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 570.137693] env[68571]: DEBUG nova.compute.manager [None req-290074a0-db5a-4eb0-8a48-70149636ff01 
tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Instance network_info: |[{"id": "a20b39ba-613f-43dc-ae91-a19d2488eef7", "address": "fa:16:3e:31:56:37", "network": {"id": "653e8d49-b7ab-4d09-aa68-b76012e5b38e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-503364041-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "290427ab03f446ce9297ea393c083ff9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa20b39ba-61", "ovs_interfaceid": "a20b39ba-613f-43dc-ae91-a19d2488eef7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 570.138590] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:31:56:37', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2180b40f-2bb0-47da-ba80-c2fbe7f98af0', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a20b39ba-613f-43dc-ae91-a19d2488eef7', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 570.148038] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Creating folder: Project (290427ab03f446ce9297ea393c083ff9). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 570.148038] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-131490ea-77a7-497f-bd25-871e266b91fb {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 570.161645] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Created folder: Project (290427ab03f446ce9297ea393c083ff9) in parent group-v692787. [ 570.161843] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Creating folder: Instances. Parent ref: group-v692817. 
{{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 570.162092] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2f8a6dab-906a-45d3-8f53-007d6df0ff6e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 570.172851] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Created folder: Instances in parent group-v692817. [ 570.173866] env[68571]: DEBUG oslo.service.loopingcall [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 570.173866] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 570.174090] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ca2704fa-54d1-4f2d-9e6a-114310e96623 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 570.198331] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 570.198331] env[68571]: value = "task-3467584" [ 570.198331] env[68571]: _type = "Task" [ 570.198331] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 570.207317] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467584, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 570.454197] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Acquiring lock "ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 570.454197] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Lock "ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 570.713775] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467584, 'name': CreateVM_Task} progress is 99%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 571.213330] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467584, 'name': CreateVM_Task} progress is 99%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 571.296033] env[68571]: DEBUG nova.compute.manager [req-eb6406fa-8f3b-47d3-947b-88082cc46e98 req-0f0346fe-a0b5-4b6f-801e-8423263a04b6 service nova] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Received event network-vif-plugged-a20b39ba-613f-43dc-ae91-a19d2488eef7 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 571.296347] env[68571]: DEBUG oslo_concurrency.lockutils [req-eb6406fa-8f3b-47d3-947b-88082cc46e98 req-0f0346fe-a0b5-4b6f-801e-8423263a04b6 service nova] Acquiring lock "cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 571.296554] env[68571]: DEBUG oslo_concurrency.lockutils [req-eb6406fa-8f3b-47d3-947b-88082cc46e98 req-0f0346fe-a0b5-4b6f-801e-8423263a04b6 service nova] Lock "cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 571.296735] env[68571]: DEBUG oslo_concurrency.lockutils [req-eb6406fa-8f3b-47d3-947b-88082cc46e98 req-0f0346fe-a0b5-4b6f-801e-8423263a04b6 service nova] Lock "cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 571.296898] env[68571]: DEBUG nova.compute.manager [req-eb6406fa-8f3b-47d3-947b-88082cc46e98 req-0f0346fe-a0b5-4b6f-801e-8423263a04b6 service nova] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] No waiting events found dispatching network-vif-plugged-a20b39ba-613f-43dc-ae91-a19d2488eef7 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 571.297068] env[68571]: WARNING nova.compute.manager [req-eb6406fa-8f3b-47d3-947b-88082cc46e98 req-0f0346fe-a0b5-4b6f-801e-8423263a04b6 service nova] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Received unexpected event network-vif-plugged-a20b39ba-613f-43dc-ae91-a19d2488eef7 for instance with vm_state building and task_state spawning. [ 571.297244] env[68571]: DEBUG nova.compute.manager [req-eb6406fa-8f3b-47d3-947b-88082cc46e98 req-0f0346fe-a0b5-4b6f-801e-8423263a04b6 service nova] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Received event network-changed-a20b39ba-613f-43dc-ae91-a19d2488eef7 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 571.297632] env[68571]: DEBUG nova.compute.manager [req-eb6406fa-8f3b-47d3-947b-88082cc46e98 req-0f0346fe-a0b5-4b6f-801e-8423263a04b6 service nova] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Refreshing instance network info cache due to event network-changed-a20b39ba-613f-43dc-ae91-a19d2488eef7. 
{{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 571.297632] env[68571]: DEBUG oslo_concurrency.lockutils [req-eb6406fa-8f3b-47d3-947b-88082cc46e98 req-0f0346fe-a0b5-4b6f-801e-8423263a04b6 service nova] Acquiring lock "refresh_cache-cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 571.297722] env[68571]: DEBUG oslo_concurrency.lockutils [req-eb6406fa-8f3b-47d3-947b-88082cc46e98 req-0f0346fe-a0b5-4b6f-801e-8423263a04b6 service nova] Acquired lock "refresh_cache-cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 571.297900] env[68571]: DEBUG nova.network.neutron [req-eb6406fa-8f3b-47d3-947b-88082cc46e98 req-0f0346fe-a0b5-4b6f-801e-8423263a04b6 service nova] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Refreshing network info cache for port a20b39ba-613f-43dc-ae91-a19d2488eef7 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 571.539942] env[68571]: DEBUG oslo_concurrency.lockutils [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Acquiring lock "4a43ba00-1df6-4f10-a4ce-37c4ae353cc2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 571.540292] env[68571]: DEBUG oslo_concurrency.lockutils [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Lock "4a43ba00-1df6-4f10-a4ce-37c4ae353cc2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 571.711644] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467584, 'name': CreateVM_Task} progress is 99%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 572.040996] env[68571]: DEBUG nova.network.neutron [req-eb6406fa-8f3b-47d3-947b-88082cc46e98 req-0f0346fe-a0b5-4b6f-801e-8423263a04b6 service nova] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Updated VIF entry in instance network info cache for port a20b39ba-613f-43dc-ae91-a19d2488eef7. 
{{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 572.042188] env[68571]: DEBUG nova.network.neutron [req-eb6406fa-8f3b-47d3-947b-88082cc46e98 req-0f0346fe-a0b5-4b6f-801e-8423263a04b6 service nova] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Updating instance_info_cache with network_info: [{"id": "a20b39ba-613f-43dc-ae91-a19d2488eef7", "address": "fa:16:3e:31:56:37", "network": {"id": "653e8d49-b7ab-4d09-aa68-b76012e5b38e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-503364041-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "290427ab03f446ce9297ea393c083ff9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa20b39ba-61", "ovs_interfaceid": "a20b39ba-613f-43dc-ae91-a19d2488eef7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 572.054479] env[68571]: DEBUG oslo_concurrency.lockutils [req-eb6406fa-8f3b-47d3-947b-88082cc46e98 req-0f0346fe-a0b5-4b6f-801e-8423263a04b6 service nova] Releasing lock "refresh_cache-cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 572.213320] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467584, 'name': CreateVM_Task, 'duration_secs': 1.527119} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 572.213493] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 572.214760] env[68571]: DEBUG oslo_concurrency.lockutils [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 572.214760] env[68571]: DEBUG oslo_concurrency.lockutils [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 572.214904] env[68571]: DEBUG oslo_concurrency.lockutils [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 572.216029] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-302c3081-0421-4c30-8a02-6025da30a9f6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 572.220015] env[68571]: DEBUG oslo_vmware.api [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Waiting for the task: (returnval){ [ 572.220015] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]520c7acd-d221-1cf6-3238-6bb9eb777020" [ 572.220015] env[68571]: _type = "Task" [ 572.220015] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 572.229710] env[68571]: DEBUG oslo_vmware.api [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]520c7acd-d221-1cf6-3238-6bb9eb777020, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 572.465912] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Acquiring lock "ccd1b692-f511-43c8-8b3d-ce92ef27670f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 572.466120] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Lock "ccd1b692-f511-43c8-8b3d-ce92ef27670f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 572.714938] env[68571]: DEBUG nova.compute.manager [req-76818eac-4045-4205-8f6e-c2faa762ccbb req-f7de5c5c-3d5a-4c57-afed-4621bfb57682 service nova] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Received event network-changed-35462b23-1d6d-4333-ab0e-e7aa78b8bb6b {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 572.715204] env[68571]: DEBUG nova.compute.manager [req-76818eac-4045-4205-8f6e-c2faa762ccbb req-f7de5c5c-3d5a-4c57-afed-4621bfb57682 service nova] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Refreshing instance network info cache due to event network-changed-35462b23-1d6d-4333-ab0e-e7aa78b8bb6b. {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 572.716074] env[68571]: DEBUG oslo_concurrency.lockutils [req-76818eac-4045-4205-8f6e-c2faa762ccbb req-f7de5c5c-3d5a-4c57-afed-4621bfb57682 service nova] Acquiring lock "refresh_cache-e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 572.716287] env[68571]: DEBUG oslo_concurrency.lockutils [req-76818eac-4045-4205-8f6e-c2faa762ccbb req-f7de5c5c-3d5a-4c57-afed-4621bfb57682 service nova] Acquired lock "refresh_cache-e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 572.716463] env[68571]: DEBUG nova.network.neutron [req-76818eac-4045-4205-8f6e-c2faa762ccbb req-f7de5c5c-3d5a-4c57-afed-4621bfb57682 service nova] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Refreshing network info cache for port 35462b23-1d6d-4333-ab0e-e7aa78b8bb6b {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 572.735861] env[68571]: DEBUG oslo_concurrency.lockutils [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 572.736488] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) 
_fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 572.736488] env[68571]: DEBUG oslo_concurrency.lockutils [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 573.515222] env[68571]: DEBUG nova.network.neutron [req-76818eac-4045-4205-8f6e-c2faa762ccbb req-f7de5c5c-3d5a-4c57-afed-4621bfb57682 service nova] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Updated VIF entry in instance network info cache for port 35462b23-1d6d-4333-ab0e-e7aa78b8bb6b. {{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 573.515222] env[68571]: DEBUG nova.network.neutron [req-76818eac-4045-4205-8f6e-c2faa762ccbb req-f7de5c5c-3d5a-4c57-afed-4621bfb57682 service nova] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Updating instance_info_cache with network_info: [{"id": "35462b23-1d6d-4333-ab0e-e7aa78b8bb6b", "address": "fa:16:3e:e6:63:40", "network": {"id": "2a8924a7-6ad6-42e0-8092-758533aa8e53", "bridge": "br-int", "label": "tempest-SecurityGroupsTestJSON-498562246-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8c1730599cca4abeb755b2710e854059", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c24464bb-bb6b-43a2-bdcd-8086ad1a307f", "external-id": "nsx-vlan-transportzone-781", "segmentation_id": 781, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap35462b23-1d", "ovs_interfaceid": "35462b23-1d6d-4333-ab0e-e7aa78b8bb6b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 573.535154] env[68571]: DEBUG oslo_concurrency.lockutils [req-76818eac-4045-4205-8f6e-c2faa762ccbb req-f7de5c5c-3d5a-4c57-afed-4621bfb57682 service nova] Releasing lock "refresh_cache-e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 573.718934] env[68571]: DEBUG oslo_concurrency.lockutils [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Acquiring lock "3adaf481-5844-45ac-8dc9-eb396a47ed1c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 573.719293] env[68571]: DEBUG oslo_concurrency.lockutils [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Lock "3adaf481-5844-45ac-8dc9-eb396a47ed1c" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 574.709858] env[68571]: DEBUG oslo_concurrency.lockutils [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Acquiring lock "c962c9c7-04a4-46ec-a46f-fac13caa6a1e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 574.710555] env[68571]: DEBUG oslo_concurrency.lockutils [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Lock "c962c9c7-04a4-46ec-a46f-fac13caa6a1e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 576.287214] env[68571]: DEBUG oslo_concurrency.lockutils [None req-6153af01-7318-4b3f-a3ac-3e24e98334da tempest-ServersWithSpecificFlavorTestJSON-2116283528 tempest-ServersWithSpecificFlavorTestJSON-2116283528-project-member] Acquiring lock "0be1ddd3-e07f-49b3-a5a7-df32b5262c30" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 576.287796] env[68571]: DEBUG oslo_concurrency.lockutils [None req-6153af01-7318-4b3f-a3ac-3e24e98334da tempest-ServersWithSpecificFlavorTestJSON-2116283528 tempest-ServersWithSpecificFlavorTestJSON-2116283528-project-member] Lock "0be1ddd3-e07f-49b3-a5a7-df32b5262c30" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 584.421539] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f2c55328-3980-4dbe-bf30-b32cc1b2dd1c tempest-ImagesTestJSON-1315536367 tempest-ImagesTestJSON-1315536367-project-member] Acquiring lock "6894c90c-cbfb-4226-a0b5-e195f923c8e0" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 584.421799] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f2c55328-3980-4dbe-bf30-b32cc1b2dd1c tempest-ImagesTestJSON-1315536367 tempest-ImagesTestJSON-1315536367-project-member] Lock "6894c90c-cbfb-4226-a0b5-e195f923c8e0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 587.423157] env[68571]: DEBUG oslo_concurrency.lockutils [None req-37114a01-3f73-48b5-b0ca-ae97ed1a5b26 tempest-ServersV294TestFqdnHostnames-789885078 tempest-ServersV294TestFqdnHostnames-789885078-project-member] Acquiring lock "7ff5aa4a-0f8a-4ed8-a918-ef2fe3410455" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 587.423948] env[68571]: DEBUG oslo_concurrency.lockutils [None req-37114a01-3f73-48b5-b0ca-ae97ed1a5b26 tempest-ServersV294TestFqdnHostnames-789885078 tempest-ServersV294TestFqdnHostnames-789885078-project-member] Lock "7ff5aa4a-0f8a-4ed8-a918-ef2fe3410455" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 587.601917] env[68571]: DEBUG oslo_concurrency.lockutils [None req-10a1a04a-478e-409f-97c4-00b1e9598ad3 tempest-TenantUsagesTestJSON-1278198306 tempest-TenantUsagesTestJSON-1278198306-project-member] Acquiring lock "52ef7a21-6254-4ac1-a3c0-93f1ac70dd9c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 587.602258] env[68571]: DEBUG oslo_concurrency.lockutils [None req-10a1a04a-478e-409f-97c4-00b1e9598ad3 tempest-TenantUsagesTestJSON-1278198306 tempest-TenantUsagesTestJSON-1278198306-project-member] Lock "52ef7a21-6254-4ac1-a3c0-93f1ac70dd9c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 588.412665] env[68571]: DEBUG oslo_concurrency.lockutils [None req-4d641e68-b197-4303-b234-1a916d8d0924 tempest-ServerAddressesTestJSON-1231908386 tempest-ServerAddressesTestJSON-1231908386-project-member] Acquiring lock "db77f64d-5b6c-4a88-aa1c-2622832b3f58" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 588.412888] env[68571]: DEBUG oslo_concurrency.lockutils [None req-4d641e68-b197-4303-b234-1a916d8d0924 tempest-ServerAddressesTestJSON-1231908386 tempest-ServerAddressesTestJSON-1231908386-project-member] Lock "db77f64d-5b6c-4a88-aa1c-2622832b3f58" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 588.437158] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2374bcbf-15cc-48d0-af82-4deb47a498ce tempest-ImagesNegativeTestJSON-1202359250 tempest-ImagesNegativeTestJSON-1202359250-project-member] Acquiring lock "7a000e36-e100-4c79-a170-8cf86a4244d7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 588.441229] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2374bcbf-15cc-48d0-af82-4deb47a498ce tempest-ImagesNegativeTestJSON-1202359250 tempest-ImagesNegativeTestJSON-1202359250-project-member] Lock "7a000e36-e100-4c79-a170-8cf86a4244d7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.002s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 589.553054] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f5a754fd-49aa-4f93-9ea3-91bab45f7731 tempest-ServersTestJSON-1811012872 
tempest-ServersTestJSON-1811012872-project-member] Acquiring lock "47ab9428-5860-4c42-a5ec-a9ff608790e9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 589.553332] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f5a754fd-49aa-4f93-9ea3-91bab45f7731 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Lock "47ab9428-5860-4c42-a5ec-a9ff608790e9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 590.204042] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c19cbc55-cf90-40d9-9e3a-4cfd43138761 tempest-ImagesOneServerTestJSON-2111796249 tempest-ImagesOneServerTestJSON-2111796249-project-member] Acquiring lock "a91a0cd6-a014-43c7-8723-55825c0c8662" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 590.204685] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c19cbc55-cf90-40d9-9e3a-4cfd43138761 tempest-ImagesOneServerTestJSON-2111796249 tempest-ImagesOneServerTestJSON-2111796249-project-member] Lock "a91a0cd6-a014-43c7-8723-55825c0c8662" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 597.001088] env[68571]: DEBUG oslo_concurrency.lockutils [None req-60e62a2e-5b06-4d9f-962b-fc758ff5d907 tempest-ServerDiagnosticsTest-1371719933 tempest-ServerDiagnosticsTest-1371719933-project-member] Acquiring lock "a6eac04c-996e-4733-a37e-d1ba61762409" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 597.001860] env[68571]: DEBUG oslo_concurrency.lockutils [None req-60e62a2e-5b06-4d9f-962b-fc758ff5d907 tempest-ServerDiagnosticsTest-1371719933 tempest-ServerDiagnosticsTest-1371719933-project-member] Lock "a6eac04c-996e-4733-a37e-d1ba61762409" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 599.727072] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9db9e5d0-852f-4143-bea9-b8e6bca60028 tempest-ServerMetadataTestJSON-517122474 tempest-ServerMetadataTestJSON-517122474-project-member] Acquiring lock "ef53dbb8-20d3-4b5c-be29-ce75cc6c0233" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 599.727339] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9db9e5d0-852f-4143-bea9-b8e6bca60028 tempest-ServerMetadataTestJSON-517122474 tempest-ServerMetadataTestJSON-517122474-project-member] Lock "ef53dbb8-20d3-4b5c-be29-ce75cc6c0233" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 601.980939] env[68571]: WARNING oslo_vmware.rw_handles [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 601.980939] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 601.980939] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 601.980939] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 601.980939] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 601.980939] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 601.980939] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 601.980939] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 601.980939] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 601.980939] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 601.980939] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 601.980939] env[68571]: ERROR oslo_vmware.rw_handles [ 601.981631] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/1b2b5c84-386d-4739-bef6-6c90336e875e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 601.983512] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 601.984322] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Copying Virtual Disk [datastore1] vmware_temp/1b2b5c84-386d-4739-bef6-6c90336e875e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/1b2b5c84-386d-4739-bef6-6c90336e875e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 601.984322] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-18df2ab5-31da-4940-b94c-bbf80db9ec43 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 601.996069] env[68571]: DEBUG oslo_vmware.api [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] 
Waiting for the task: (returnval){ [ 601.996069] env[68571]: value = "task-3467585" [ 601.996069] env[68571]: _type = "Task" [ 601.996069] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 602.010120] env[68571]: DEBUG oslo_vmware.api [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Task: {'id': task-3467585, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 602.508671] env[68571]: DEBUG oslo_vmware.exceptions [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Fault InvalidArgument not matched. {{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 602.509045] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 602.516506] env[68571]: ERROR nova.compute.manager [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 602.516506] env[68571]: Faults: ['InvalidArgument'] [ 602.516506] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Traceback (most recent call last): [ 602.516506] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 602.516506] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] yield resources [ 602.516506] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 602.516506] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] self.driver.spawn(context, instance, image_meta, [ 602.516506] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 602.516506] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 602.516506] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 602.516506] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] self._fetch_image_if_missing(context, vi) [ 602.516506] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 602.517992] env[68571]: ERROR nova.compute.manager 
[instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] image_cache(vi, tmp_image_ds_loc) [ 602.517992] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 602.517992] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] vm_util.copy_virtual_disk( [ 602.517992] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 602.517992] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] session._wait_for_task(vmdk_copy_task) [ 602.517992] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 602.517992] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] return self.wait_for_task(task_ref) [ 602.517992] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 602.517992] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] return evt.wait() [ 602.517992] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 602.517992] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] result = hub.switch() [ 602.517992] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 602.517992] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] return self.greenlet.switch() [ 602.518526] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 602.518526] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] self.f(*self.args, **self.kw) [ 602.518526] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 602.518526] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] raise exceptions.translate_fault(task_info.error) [ 602.518526] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 602.518526] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Faults: ['InvalidArgument'] [ 602.518526] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] [ 602.518526] env[68571]: INFO nova.compute.manager [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Terminating instance [ 602.518526] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 
tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 602.519289] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 602.519289] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5346049b-8a2f-4d7e-8106-d71fdc0da9ae {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 602.521127] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Acquiring lock "refresh_cache-4c5c97bc-4a9f-413b-a75f-a197270103a2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 602.521304] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Acquired lock "refresh_cache-4c5c97bc-4a9f-413b-a75f-a197270103a2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 602.521591] env[68571]: DEBUG nova.network.neutron [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 602.529218] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 602.529410] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 602.531080] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1f61c998-f65b-4616-ac4f-672b15d80409 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 602.540351] env[68571]: DEBUG oslo_vmware.api [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Waiting for the task: (returnval){
[ 602.540351] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]525a14c2-c819-a68c-75de-9f7c60eebd93"
[ 602.540351] env[68571]: _type = "Task"
[ 602.540351] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 602.551710] env[68571]: DEBUG oslo_vmware.api [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]525a14c2-c819-a68c-75de-9f7c60eebd93, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 602.570145] env[68571]: DEBUG nova.network.neutron [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 602.767607] env[68571]: DEBUG nova.network.neutron [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 602.781910] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Releasing lock "refresh_cache-4c5c97bc-4a9f-413b-a75f-a197270103a2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 602.781910] env[68571]: DEBUG nova.compute.manager [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
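The repeated "Waiting for the task: (returnval){ ... } to complete" blocks throughout this section are oslo.vmware's task poller: wait_for_task re-reads the vCenter TaskInfo on a fixed interval until the task reaches a terminal state, logging progress on the way. A rough sketch of that loop; get_task_info here is a hypothetical callable standing in for the TaskInfo read, and the state names follow the vSphere task model rather than oslo.vmware's exact internals:

    import time

    class TaskFailed(Exception):
        """Raised when the polled task ends in the error state."""

    def wait_for_task(get_task_info, poll_interval=0.5):
        while True:
            info = get_task_info()                      # one TaskInfo read per poll
            state = info["state"]
            if state in ("queued", "running"):
                print("progress is %s%%" % info.get("progress", 0))
            elif state == "success":
                return info.get("result")
            elif state == "error":
                # oslo.vmware translates the fault at this point; the
                # InvalidArgument/fileType failure earlier in the log
                # surfaces to Nova through exactly this path.
                raise TaskFailed(info.get("error"))
            time.sleep(poll_interval)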
[ 602.781910] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 602.782523] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29411be1-50f4-46e7-ad6c-0fee286435c0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 602.800186] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 602.800186] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-849f9a18-ded8-4de8-bf08-99b89a16b92c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 602.836614] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 602.836614] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 602.840302] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Deleting the datastore file [datastore1] 4c5c97bc-4a9f-413b-a75f-a197270103a2 {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 602.840302] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b897fadd-22b2-4d60-a377-35e3f458a8e3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 602.843960] env[68571]: DEBUG oslo_vmware.api [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Waiting for the task: (returnval){
[ 602.843960] env[68571]: value = "task-3467587"
[ 602.843960] env[68571]: _type = "Task"
[ 602.843960] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 602.857635] env[68571]: DEBUG oslo_vmware.api [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Task: {'id': task-3467587, 'name': DeleteDatastoreFile_Task} progress is 0%.
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 603.051946] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 603.052380] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Creating directory with path [datastore1] vmware_temp/21f588c5-25bc-48e8-a9ab-68aa19fd20fd/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 603.052380] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-08a1bdf0-5627-4b56-b30d-7d2cd368d0c6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.065922] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Created directory with path [datastore1] vmware_temp/21f588c5-25bc-48e8-a9ab-68aa19fd20fd/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 603.066165] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Fetch image to [datastore1] vmware_temp/21f588c5-25bc-48e8-a9ab-68aa19fd20fd/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 603.066340] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/21f588c5-25bc-48e8-a9ab-68aa19fd20fd/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 603.067190] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cd9c459-db75-4c49-bd02-d5bb62590072 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.077501] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e32b3ebe-42c6-4847-95d9-7fb92bb3d031 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.088233] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f0dc17e-cab6-4b00-9d64-ca8b5bf6fc9e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.130866] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-8ef0379b-0553-4b3b-bf03-3936b96db71f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.137573] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c43016e6-4a48-4a17-a488-3928f38416a0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 603.169835] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 603.248648] env[68571]: DEBUG oslo_vmware.rw_handles [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/21f588c5-25bc-48e8-a9ab-68aa19fd20fd/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 603.316135] env[68571]: DEBUG oslo_vmware.rw_handles [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 603.316911] env[68571]: DEBUG oslo_vmware.rw_handles [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/21f588c5-25bc-48e8-a9ab-68aa19fd20fd/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 603.356221] env[68571]: DEBUG oslo_vmware.api [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Task: {'id': task-3467587, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.03279} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 603.356402] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 603.356711] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 603.356759] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 603.357240] env[68571]: INFO nova.compute.manager [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Took 0.58 seconds to destroy the instance on the hypervisor.
[ 603.357494] env[68571]: DEBUG oslo.service.loopingcall [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 603.357734] env[68571]: DEBUG nova.compute.manager [-] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Skipping network deallocation for instance since networking was not requested. {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
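The oslo.service loopingcall entry shows that network deallocation runs inside a retry wrapper (_deallocate_network_with_retries), so a transient Neutron failure during teardown does not strand the ports; here the body is a no-op because this server was booted with no networks. A stdlib approximation of that retry-until-done wrapper; the attempt budget, backoff, and deallocate callable are illustrative assumptions, not Nova's configured values:

    import time

    def call_with_retries(func, max_attempts=3, base_delay=1.0):
        # In the spirit of oslo.service's BackOffLoopingCall used above.
        for attempt in range(1, max_attempts + 1):
            try:
                return func()
            except Exception as exc:
                if attempt == max_attempts:
                    raise  # give up; the caller logs and continues teardown
                delay = base_delay * 2 ** (attempt - 1)  # simple exponential backoff
                print(f"deallocation failed ({exc}); retrying in {delay:.0f}s")
                time.sleep(delay)

    def deallocate():
        """Hypothetical stand-in for the Neutron port-cleanup call."""
        return None

    call_with_retries(deallocate)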
[ 603.360338] env[68571]: DEBUG nova.compute.claims [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 603.360507] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 603.360824] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 603.815220] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c0a08f4-e622-4ae3-8354-0d4d3b18104c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 603.822518] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f440d93b-91bf-4c2f-84c0-299242bd57fd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 603.854345] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85b81612-2e20-4ace-bfa3-24a710341eb2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 603.861870] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd6ede8e-8512-4b3d-a146-0ee308f2eda5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 603.875105] env[68571]: DEBUG nova.compute.provider_tree [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 603.884797] env[68571]: DEBUG nova.scheduler.client.report [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 603.902529] env[68571]: DEBUG
oslo_concurrency.lockutils [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.542s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 603.903139] env[68571]: ERROR nova.compute.manager [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 603.903139] env[68571]: Faults: ['InvalidArgument'] [ 603.903139] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Traceback (most recent call last): [ 603.903139] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 603.903139] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] self.driver.spawn(context, instance, image_meta, [ 603.903139] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 603.903139] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 603.903139] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 603.903139] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] self._fetch_image_if_missing(context, vi) [ 603.903139] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 603.903139] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] image_cache(vi, tmp_image_ds_loc) [ 603.903139] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 603.903533] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] vm_util.copy_virtual_disk( [ 603.903533] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 603.903533] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] session._wait_for_task(vmdk_copy_task) [ 603.903533] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 603.903533] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] return self.wait_for_task(task_ref) [ 603.903533] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 603.903533] env[68571]: ERROR nova.compute.manager [instance: 
4c5c97bc-4a9f-413b-a75f-a197270103a2] return evt.wait() [ 603.903533] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 603.903533] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] result = hub.switch() [ 603.903533] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 603.903533] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] return self.greenlet.switch() [ 603.903533] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 603.903533] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] self.f(*self.args, **self.kw) [ 603.903887] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 603.903887] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] raise exceptions.translate_fault(task_info.error) [ 603.903887] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 603.903887] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Faults: ['InvalidArgument'] [ 603.903887] env[68571]: ERROR nova.compute.manager [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] [ 603.903887] env[68571]: DEBUG nova.compute.utils [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 603.906793] env[68571]: DEBUG nova.compute.manager [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Build of instance 4c5c97bc-4a9f-413b-a75f-a197270103a2 was re-scheduled: A specified parameter was not correct: fileType [ 603.906793] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 603.907295] env[68571]: DEBUG nova.compute.manager [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 603.907886] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Acquiring lock "refresh_cache-4c5c97bc-4a9f-413b-a75f-a197270103a2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 603.907886] env[68571]: DEBUG oslo_concurrency.lockutils [None 
req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Acquired lock "refresh_cache-4c5c97bc-4a9f-413b-a75f-a197270103a2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 603.907886] env[68571]: DEBUG nova.network.neutron [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 603.950806] env[68571]: DEBUG nova.network.neutron [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 604.125105] env[68571]: DEBUG nova.network.neutron [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 604.138243] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Releasing lock "refresh_cache-4c5c97bc-4a9f-413b-a75f-a197270103a2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 604.138614] env[68571]: DEBUG nova.compute.manager [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 604.139056] env[68571]: DEBUG nova.compute.manager [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] [instance: 4c5c97bc-4a9f-413b-a75f-a197270103a2] Skipping network deallocation for instance since networking was not requested. 
{{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 604.276851] env[68571]: INFO nova.scheduler.client.report [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Deleted allocations for instance 4c5c97bc-4a9f-413b-a75f-a197270103a2 [ 604.322263] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a18440b7-1936-4dc5-990b-e5fd49c4456f tempest-ServerDiagnosticsV248Test-1249713863 tempest-ServerDiagnosticsV248Test-1249713863-project-member] Lock "4c5c97bc-4a9f-413b-a75f-a197270103a2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 49.567s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 604.383475] env[68571]: DEBUG nova.compute.manager [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 604.469018] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 604.469018] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 604.469468] env[68571]: INFO nova.compute.claims [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 604.614977] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Acquiring lock "25f17a16-f752-4927-a2a5-73f1f18e5c8c" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 604.615361] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Lock "25f17a16-f752-4927-a2a5-73f1f18e5c8c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 604.961032] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-9c695994-a8da-4837-9f02-b4f839d4090c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 604.967383] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7cdf6d33-a876-496e-a354-dd851e4e590c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 605.004030] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c45c2a94-d811-44cc-b5aa-c691cd30d99b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 605.010324] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96487e8c-0fed-4a7d-af39-e25c912e1dfa {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 605.026112] env[68571]: DEBUG nova.compute.provider_tree [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 605.038667] env[68571]: DEBUG nova.scheduler.client.report [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 605.059016] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.591s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 605.059588] env[68571]: DEBUG nova.compute.manager [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Start building networks asynchronously for instance. 
{{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 605.108943] env[68571]: DEBUG nova.compute.utils [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 605.111587] env[68571]: DEBUG nova.compute.manager [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 605.111587] env[68571]: DEBUG nova.network.neutron [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 605.130456] env[68571]: DEBUG nova.compute.manager [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 605.190171] env[68571]: DEBUG nova.policy [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f6208975c4494a9a99b125abce56c2f7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '35a7174bea8a49958c5ddc5b88634f5b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 605.226991] env[68571]: DEBUG nova.compute.manager [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 605.270491] env[68571]: DEBUG nova.virt.hardware [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 605.270787] env[68571]: DEBUG nova.virt.hardware [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 605.270987] env[68571]: DEBUG nova.virt.hardware [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 605.271509] env[68571]: DEBUG nova.virt.hardware [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 605.271509] env[68571]: DEBUG nova.virt.hardware [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 605.271613] env[68571]: DEBUG nova.virt.hardware [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 605.271928] env[68571]: DEBUG nova.virt.hardware [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 605.272118] env[68571]: DEBUG nova.virt.hardware [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) 
_get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 605.272343] env[68571]: DEBUG nova.virt.hardware [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 605.272585] env[68571]: DEBUG nova.virt.hardware [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 605.272807] env[68571]: DEBUG nova.virt.hardware [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 605.273781] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8fb69f9-7baa-4877-92c0-3b1e92853ca7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 605.282740] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f46efb61-2510-4898-a80e-fef8c1de2a9b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 605.556139] env[68571]: DEBUG nova.network.neutron [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Successfully created port: c72a41da-cac2-4070-97ac-8e9263e34645 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 606.613144] env[68571]: DEBUG nova.network.neutron [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Successfully updated port: c72a41da-cac2-4070-97ac-8e9263e34645 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 606.648340] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Acquiring lock "refresh_cache-ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 606.648340] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Acquired lock "refresh_cache-ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 606.648340] env[68571]: DEBUG nova.network.neutron [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] 
Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 606.710488] env[68571]: DEBUG nova.network.neutron [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 606.818447] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8b703ec1-2793-4846-a6f0-e3fdce5e72cf tempest-AttachInterfacesTestJSON-2026169319 tempest-AttachInterfacesTestJSON-2026169319-project-member] Acquiring lock "78ce800c-1f8e-496e-9be2-24675657acb2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 606.819098] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8b703ec1-2793-4846-a6f0-e3fdce5e72cf tempest-AttachInterfacesTestJSON-2026169319 tempest-AttachInterfacesTestJSON-2026169319-project-member] Lock "78ce800c-1f8e-496e-9be2-24675657acb2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 606.819098] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9a5e537f-c469-42f2-8f69-b716684f0e52 tempest-ServerRescueNegativeTestJSON-1109921790 tempest-ServerRescueNegativeTestJSON-1109921790-project-member] Acquiring lock "0d78609e-cda0-4309-af6e-7d30a939443b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 606.819098] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9a5e537f-c469-42f2-8f69-b716684f0e52 tempest-ServerRescueNegativeTestJSON-1109921790 tempest-ServerRescueNegativeTestJSON-1109921790-project-member] Lock "0d78609e-cda0-4309-af6e-7d30a939443b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 607.043499] env[68571]: DEBUG nova.network.neutron [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Updating instance_info_cache with network_info: [{"id": "c72a41da-cac2-4070-97ac-8e9263e34645", "address": "fa:16:3e:7f:95:97", "network": {"id": "11a683b2-602b-426c-b83e-8ae9788cd50f", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1977383221-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "35a7174bea8a49958c5ddc5b88634f5b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": 
"5ba27300-88df-4c95-b9e0-a4a8b5039c3c", "external-id": "nsx-vlan-transportzone-681", "segmentation_id": 681, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc72a41da-ca", "ovs_interfaceid": "c72a41da-cac2-4070-97ac-8e9263e34645", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 607.060023] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Releasing lock "refresh_cache-ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 607.060023] env[68571]: DEBUG nova.compute.manager [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Instance network_info: |[{"id": "c72a41da-cac2-4070-97ac-8e9263e34645", "address": "fa:16:3e:7f:95:97", "network": {"id": "11a683b2-602b-426c-b83e-8ae9788cd50f", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1977383221-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "35a7174bea8a49958c5ddc5b88634f5b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5ba27300-88df-4c95-b9e0-a4a8b5039c3c", "external-id": "nsx-vlan-transportzone-681", "segmentation_id": 681, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc72a41da-ca", "ovs_interfaceid": "c72a41da-cac2-4070-97ac-8e9263e34645", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 607.060942] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7f:95:97', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '5ba27300-88df-4c95-b9e0-a4a8b5039c3c', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c72a41da-cac2-4070-97ac-8e9263e34645', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 607.075586] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Creating folder: Project (35a7174bea8a49958c5ddc5b88634f5b). Parent ref: group-v692787. 
{{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 607.076968] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-4e4ea36c-038a-4fa2-bc25-604bc50f9552 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 607.094066] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Created folder: Project (35a7174bea8a49958c5ddc5b88634f5b) in parent group-v692787. [ 607.094066] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Creating folder: Instances. Parent ref: group-v692820. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 607.094066] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-71f6d1cf-3fd6-4481-9ca6-f7c22192dde8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 607.101903] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Created folder: Instances in parent group-v692820. [ 607.102548] env[68571]: DEBUG oslo.service.loopingcall [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 607.102871] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 607.103211] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-92be1a70-37c8-4a24-a0e1-4c3098059246 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 607.128199] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 607.128199] env[68571]: value = "task-3467590" [ 607.128199] env[68571]: _type = "Task" [ 607.128199] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 607.136059] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467590, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 607.643206] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467590, 'name': CreateVM_Task, 'duration_secs': 0.28869} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 607.645448] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 607.645448] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 607.645448] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 607.645448] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 607.645448] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a516e1e4-292f-47a1-bf4a-48a8651f5d82 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 607.650929] env[68571]: DEBUG oslo_vmware.api [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Waiting for the task: (returnval){ [ 607.650929] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52daa50d-41bb-be5a-c1bd-aad2ca0d7080" [ 607.650929] env[68571]: _type = "Task" [ 607.650929] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 607.663142] env[68571]: DEBUG oslo_vmware.api [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52daa50d-41bb-be5a-c1bd-aad2ca0d7080, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 608.163561] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 608.163771] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 608.164031] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 608.177179] env[68571]: DEBUG nova.compute.manager [req-eac09527-421e-4f1c-b914-3c8f92e5ff41 req-381db69a-6a50-47db-a6ae-612af32b3db9 service nova] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Received event network-vif-plugged-c72a41da-cac2-4070-97ac-8e9263e34645 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 608.177449] env[68571]: DEBUG oslo_concurrency.lockutils [req-eac09527-421e-4f1c-b914-3c8f92e5ff41 req-381db69a-6a50-47db-a6ae-612af32b3db9 service nova] Acquiring lock "ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 608.177652] env[68571]: DEBUG oslo_concurrency.lockutils [req-eac09527-421e-4f1c-b914-3c8f92e5ff41 req-381db69a-6a50-47db-a6ae-612af32b3db9 service nova] Lock "ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 608.177814] env[68571]: DEBUG oslo_concurrency.lockutils [req-eac09527-421e-4f1c-b914-3c8f92e5ff41 req-381db69a-6a50-47db-a6ae-612af32b3db9 service nova] Lock "ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 608.178713] env[68571]: DEBUG nova.compute.manager [req-eac09527-421e-4f1c-b914-3c8f92e5ff41 req-381db69a-6a50-47db-a6ae-612af32b3db9 service nova] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] No waiting events found dispatching network-vif-plugged-c72a41da-cac2-4070-97ac-8e9263e34645 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 608.179102] env[68571]: WARNING nova.compute.manager [req-eac09527-421e-4f1c-b914-3c8f92e5ff41 req-381db69a-6a50-47db-a6ae-612af32b3db9 service nova] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Received 
unexpected event network-vif-plugged-c72a41da-cac2-4070-97ac-8e9263e34645 for instance with vm_state building and task_state spawning. [ 609.621103] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d06bdd6f-c5c0-4aaf-b5a9-d7a41ea67cb8 tempest-ServerRescueNegativeTestJSON-1109921790 tempest-ServerRescueNegativeTestJSON-1109921790-project-member] Acquiring lock "1ed21e6d-6b5a-4e6e-9466-b5beceda09e1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 609.621403] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d06bdd6f-c5c0-4aaf-b5a9-d7a41ea67cb8 tempest-ServerRescueNegativeTestJSON-1109921790 tempest-ServerRescueNegativeTestJSON-1109921790-project-member] Lock "1ed21e6d-6b5a-4e6e-9466-b5beceda09e1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 610.742616] env[68571]: DEBUG nova.compute.manager [req-c9401089-88bf-48a7-bad4-27544db31081 req-0eaab678-51f0-4123-8376-6bd22f9086c5 service nova] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Received event network-changed-c72a41da-cac2-4070-97ac-8e9263e34645 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 610.744092] env[68571]: DEBUG nova.compute.manager [req-c9401089-88bf-48a7-bad4-27544db31081 req-0eaab678-51f0-4123-8376-6bd22f9086c5 service nova] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Refreshing instance network info cache due to event network-changed-c72a41da-cac2-4070-97ac-8e9263e34645. {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 610.744092] env[68571]: DEBUG oslo_concurrency.lockutils [req-c9401089-88bf-48a7-bad4-27544db31081 req-0eaab678-51f0-4123-8376-6bd22f9086c5 service nova] Acquiring lock "refresh_cache-ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 610.744092] env[68571]: DEBUG oslo_concurrency.lockutils [req-c9401089-88bf-48a7-bad4-27544db31081 req-0eaab678-51f0-4123-8376-6bd22f9086c5 service nova] Acquired lock "refresh_cache-ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 610.744092] env[68571]: DEBUG nova.network.neutron [req-c9401089-88bf-48a7-bad4-27544db31081 req-0eaab678-51f0-4123-8376-6bd22f9086c5 service nova] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Refreshing network info cache for port c72a41da-cac2-4070-97ac-8e9263e34645 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 611.017120] env[68571]: DEBUG nova.network.neutron [req-c9401089-88bf-48a7-bad4-27544db31081 req-0eaab678-51f0-4123-8376-6bd22f9086c5 service nova] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Updated VIF entry in instance network info cache for port c72a41da-cac2-4070-97ac-8e9263e34645. 
{{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 611.017828] env[68571]: DEBUG nova.network.neutron [req-c9401089-88bf-48a7-bad4-27544db31081 req-0eaab678-51f0-4123-8376-6bd22f9086c5 service nova] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Updating instance_info_cache with network_info: [{"id": "c72a41da-cac2-4070-97ac-8e9263e34645", "address": "fa:16:3e:7f:95:97", "network": {"id": "11a683b2-602b-426c-b83e-8ae9788cd50f", "bridge": "br-int", "label": "tempest-ImagesOneServerNegativeTestJSON-1977383221-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "35a7174bea8a49958c5ddc5b88634f5b", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5ba27300-88df-4c95-b9e0-a4a8b5039c3c", "external-id": "nsx-vlan-transportzone-681", "segmentation_id": 681, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc72a41da-ca", "ovs_interfaceid": "c72a41da-cac2-4070-97ac-8e9263e34645", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 611.039615] env[68571]: DEBUG oslo_concurrency.lockutils [req-c9401089-88bf-48a7-bad4-27544db31081 req-0eaab678-51f0-4123-8376-6bd22f9086c5 service nova] Releasing lock "refresh_cache-ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 612.151149] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a73180a1-613c-42ab-8de9-a93f490170e4 tempest-ServersTestFqdnHostnames-1143824028 tempest-ServersTestFqdnHostnames-1143824028-project-member] Acquiring lock "ee0a3514-6892-4ee8-bad7-9b2867ba439e" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 612.151553] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a73180a1-613c-42ab-8de9-a93f490170e4 tempest-ServersTestFqdnHostnames-1143824028 tempest-ServersTestFqdnHostnames-1143824028-project-member] Lock "ee0a3514-6892-4ee8-bad7-9b2867ba439e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 614.352612] env[68571]: DEBUG oslo_concurrency.lockutils [None req-0bc870a8-7e05-4d7f-bd1d-ddd77a44cd6b tempest-MultipleCreateTestJSON-976188141 tempest-MultipleCreateTestJSON-976188141-project-member] Acquiring lock "a6628de8-b7e9-466c-8cde-3f4f322c0faf" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 614.352612] env[68571]: DEBUG oslo_concurrency.lockutils [None req-0bc870a8-7e05-4d7f-bd1d-ddd77a44cd6b tempest-MultipleCreateTestJSON-976188141 
tempest-MultipleCreateTestJSON-976188141-project-member] Lock "a6628de8-b7e9-466c-8cde-3f4f322c0faf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 614.379050] env[68571]: DEBUG oslo_concurrency.lockutils [None req-0bc870a8-7e05-4d7f-bd1d-ddd77a44cd6b tempest-MultipleCreateTestJSON-976188141 tempest-MultipleCreateTestJSON-976188141-project-member] Acquiring lock "7a350ebc-61e6-4e4d-99bc-adb67b518395" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 614.379296] env[68571]: DEBUG oslo_concurrency.lockutils [None req-0bc870a8-7e05-4d7f-bd1d-ddd77a44cd6b tempest-MultipleCreateTestJSON-976188141 tempest-MultipleCreateTestJSON-976188141-project-member] Lock "7a350ebc-61e6-4e4d-99bc-adb67b518395" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 626.792696] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b0a08811-0ed4-44fb-91fc-7cbafef40caf tempest-InstanceActionsV221TestJSON-79637547 tempest-InstanceActionsV221TestJSON-79637547-project-member] Acquiring lock "95bc8fb9-032a-41d7-b925-dc9b60d21735" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 626.793038] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b0a08811-0ed4-44fb-91fc-7cbafef40caf tempest-InstanceActionsV221TestJSON-79637547 tempest-InstanceActionsV221TestJSON-79637547-project-member] Lock "95bc8fb9-032a-41d7-b925-dc9b60d21735" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 630.001745] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 630.025954] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 630.026162] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 630.026318] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 630.026790] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic 
task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 630.037972] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 630.038201] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 630.038377] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 630.038531] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 630.039608] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-be2d1040-eb22-400f-a79e-ecf14b7dd92b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.048685] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7494a4d7-b2e1-4b0a-9a64-a6aad6bcd3bd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.062335] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-819d0dba-2094-41bf-ae70-827af0b8ac48 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.068460] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a32f5418-0c3d-4219-8d75-eb3c173ccb4d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.098417] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180902MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 630.098605] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 630.098802] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: 
waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 630.168834] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 0eae5e9a-258a-44e5-9b4f-53100f15aa7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 630.168996] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 630.169136] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f3b237f4-6e23-4474-b841-aa3ca8c1486f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 630.169260] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance e49a885d-c0d2-414b-b1f0-bfc3a710e9ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 630.169381] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 630.169501] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 8c30562a-4a81-4007-923c-3bc0b922f01c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 630.169618] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 15eb6744-4b26-4d7a-8639-cb3bd13e3726 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 630.169734] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 630.169846] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 630.169959] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 630.195017] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 630.218459] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance ccd1b692-f511-43c8-8b3d-ce92ef27670f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 630.230344] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3adaf481-5844-45ac-8dc9-eb396a47ed1c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 630.240284] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance c962c9c7-04a4-46ec-a46f-fac13caa6a1e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 630.249819] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 0be1ddd3-e07f-49b3-a5a7-df32b5262c30 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 630.259543] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 6894c90c-cbfb-4226-a0b5-e195f923c8e0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 630.269473] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 7ff5aa4a-0f8a-4ed8-a918-ef2fe3410455 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 630.278537] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 52ef7a21-6254-4ac1-a3c0-93f1ac70dd9c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 630.289009] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance db77f64d-5b6c-4a88-aa1c-2622832b3f58 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 630.299473] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 7a000e36-e100-4c79-a170-8cf86a4244d7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 630.309337] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 47ab9428-5860-4c42-a5ec-a9ff608790e9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 630.318657] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance a91a0cd6-a014-43c7-8723-55825c0c8662 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 630.328158] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance a6eac04c-996e-4733-a37e-d1ba61762409 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 630.337958] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance ef53dbb8-20d3-4b5c-be29-ce75cc6c0233 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 630.347210] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 25f17a16-f752-4927-a2a5-73f1f18e5c8c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 630.357152] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 0d78609e-cda0-4309-af6e-7d30a939443b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 630.368922] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 78ce800c-1f8e-496e-9be2-24675657acb2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 630.379230] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 1ed21e6d-6b5a-4e6e-9466-b5beceda09e1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 630.388664] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance ee0a3514-6892-4ee8-bad7-9b2867ba439e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 630.398080] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance a6628de8-b7e9-466c-8cde-3f4f322c0faf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 630.409327] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 7a350ebc-61e6-4e4d-99bc-adb67b518395 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 630.420097] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 95bc8fb9-032a-41d7-b925-dc9b60d21735 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 630.420097] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 630.420097] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 630.775156] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdd03d5a-2e65-4277-9ee7-0c5ae4dee01e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.782773] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-654458fc-d540-4d36-838e-cf59fe249a1e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.814291] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-333110d0-88cf-469e-9d79-d026f6123cdb {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.821213] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c452209-3166-40ae-895a-d2135178a0d6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 630.833941] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 
[ 630.775156] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdd03d5a-2e65-4277-9ee7-0c5ae4dee01e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 630.782773] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-654458fc-d540-4d36-838e-cf59fe249a1e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 630.814291] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-333110d0-88cf-469e-9d79-d026f6123cdb {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 630.821213] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0c452209-3166-40ae-895a-d2135178a0d6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 630.833941] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 630.841887] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 630.858955] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 630.858955] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.760s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 632.322243] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 632.322548] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 632.322631] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 632.322761] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 632.342819] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 632.342992] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 632.344127] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 632.344304] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 632.344441] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 632.344571] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 632.344695] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 632.344816] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 632.344936] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 632.345069] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 632.345194] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
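The heal pass above rebuilds its candidate list and skips every instance still in the Building state, which is why it ends with "Didn't find any instances for network info cache update." A simplified sketch of that selection logic (a stand-in only, not Nova's ComputeManager._heal_instance_info_cache; the Instance class here is hypothetical):

from dataclasses import dataclass

@dataclass
class Instance:
    # Hypothetical stand-in for the two fields this sketch cares about.
    uuid: str
    vm_state: str  # e.g. 'building', 'active'

def instances_to_heal(instances):
    """Keep only instances whose network info cache is worth refreshing."""
    candidates = []
    for inst in instances:
        if inst.vm_state == 'building':
            # Mirrors the "Skipping network cache update ... Building." entries.
            print(f"[instance: {inst.uuid}] skipped: still building")
            continue
        candidates.append(inst)
    return candidates

# Every instance listed above is Building, so the pass finds nothing to do.
building = [Instance('0eae5e9a-258a-44e5-9b4f-53100f15aa7a', 'building')]
assert instances_to_heal(building) == []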
[ 632.345702] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 632.345891] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 632.346067] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 632.346201] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 651.212910] env[68571]: WARNING oslo_vmware.rw_handles [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 651.212910] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 651.212910] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 651.212910] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 651.212910] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 651.212910] env[68571]: ERROR oslo_vmware.rw_handles response.begin()
[ 651.212910] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 651.212910] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 651.212910] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 651.212910] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 651.212910] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 651.212910] env[68571]: ERROR oslo_vmware.rw_handles
[ 651.213384] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/21f588c5-25bc-48e8-a9ab-68aa19fd20fd/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 651.214781] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471]
Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 651.215079] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Copying Virtual Disk [datastore1] vmware_temp/21f588c5-25bc-48e8-a9ab-68aa19fd20fd/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/21f588c5-25bc-48e8-a9ab-68aa19fd20fd/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 651.215396] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-596723b7-8ada-43e4-8118-8a95938f969c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.223587] env[68571]: DEBUG oslo_vmware.api [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Waiting for the task: (returnval){ [ 651.223587] env[68571]: value = "task-3467591" [ 651.223587] env[68571]: _type = "Task" [ 651.223587] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 651.231840] env[68571]: DEBUG oslo_vmware.api [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Task: {'id': task-3467591, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 651.734237] env[68571]: DEBUG oslo_vmware.exceptions [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Fault InvalidArgument not matched. 
{{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 651.734517] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 651.735065] env[68571]: ERROR nova.compute.manager [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 651.735065] env[68571]: Faults: ['InvalidArgument'] [ 651.735065] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Traceback (most recent call last): [ 651.735065] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 651.735065] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] yield resources [ 651.735065] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 651.735065] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] self.driver.spawn(context, instance, image_meta, [ 651.735065] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 651.735065] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] self._vmops.spawn(context, instance, image_meta, injected_files, [ 651.735065] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 651.735065] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] self._fetch_image_if_missing(context, vi) [ 651.735065] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 651.735366] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] image_cache(vi, tmp_image_ds_loc) [ 651.735366] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 651.735366] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] vm_util.copy_virtual_disk( [ 651.735366] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 651.735366] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] session._wait_for_task(vmdk_copy_task) [ 651.735366] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 651.735366] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] return self.wait_for_task(task_ref) [ 651.735366] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 651.735366] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] return evt.wait() [ 651.735366] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 651.735366] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] result = hub.switch() [ 651.735366] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 651.735366] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] return self.greenlet.switch() [ 651.735694] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 651.735694] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] self.f(*self.args, **self.kw) [ 651.735694] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 651.735694] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] raise exceptions.translate_fault(task_info.error) [ 651.735694] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 651.735694] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Faults: ['InvalidArgument'] [ 651.735694] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] [ 651.735694] env[68571]: INFO nova.compute.manager [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Terminating instance [ 651.736902] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 651.737162] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 651.737352] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2f3c6682-a23b-4c77-961b-4c2be865db21 {{(pid=68571) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.743017] env[68571]: DEBUG nova.compute.manager [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 651.743017] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 651.743017] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1160075d-c8f1-4433-8be6-045c67b83ec9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.749366] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 651.749743] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2eee9e77-8242-4a58-a926-3e800c1a2db6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.752016] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 651.752283] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 651.753288] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ca94875c-ac2a-4e11-81d4-a8d9220a0541 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.757975] env[68571]: DEBUG oslo_vmware.api [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Waiting for the task: (returnval){ [ 651.757975] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52cdcbe5-1b2d-a86c-f098-6264c6e6f0b0" [ 651.757975] env[68571]: _type = "Task" [ 651.757975] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 651.765310] env[68571]: DEBUG oslo_vmware.api [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52cdcbe5-1b2d-a86c-f098-6264c6e6f0b0, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 651.826375] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 651.826585] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 651.826795] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Deleting the datastore file [datastore1] c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471 {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 651.827069] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-dce3e337-e7f8-4186-856a-8d689ebfe9db {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 651.833073] env[68571]: DEBUG oslo_vmware.api [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Waiting for the task: (returnval){ [ 651.833073] env[68571]: value = "task-3467593" [ 651.833073] env[68571]: _type = "Task" [ 651.833073] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 651.840560] env[68571]: DEBUG oslo_vmware.api [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Task: {'id': task-3467593, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 652.267946] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 652.268237] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Creating directory with path [datastore1] vmware_temp/70af6db8-2a31-4683-91cf-d97df3e8d8b9/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 652.268476] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0f7ecc50-89c5-4304-8c89-6b6ceecc33a4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.280135] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Created directory with path [datastore1] vmware_temp/70af6db8-2a31-4683-91cf-d97df3e8d8b9/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 652.280350] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Fetch image to [datastore1] vmware_temp/70af6db8-2a31-4683-91cf-d97df3e8d8b9/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 652.280520] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/70af6db8-2a31-4683-91cf-d97df3e8d8b9/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 652.281295] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c65e598c-1b07-479e-bee3-4bc334b19c9e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.287793] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6e8d248-a964-42a5-95b9-3dbc7e87a963 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.296574] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9581c43c-9bae-4342-a6c5-34f350e641d0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.326484] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-688b160f-df82-4633-a619-6a7b62df24eb {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.332071] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-6119a8bf-0717-4ea5-b65f-913cdb39319a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.340970] env[68571]: DEBUG oslo_vmware.api [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Task: {'id': task-3467593, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07013} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 652.341217] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 652.341397] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 652.341564] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 652.341733] env[68571]: INFO nova.compute.manager [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Took 0.60 seconds to destroy the instance on the hypervisor. 
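The task lifecycle traced around the destroy above ("Waiting for the task ... progress is 0% ... completed successfully ... duration_secs") is oslo.vmware's polling loop: wait_for_task re-reads the vCenter task info until it reports success or error. A stripped-down illustration of that pattern (a sketch only; the real implementation lives in oslo_vmware.api and runs on an eventlet-based looping call rather than time.sleep, and poll_task_info here is a hypothetical callable):

import time

def wait_for_task(poll_task_info, poll_interval=0.5, timeout=300.0):
    """Poll a vCenter-style task until it leaves the running states.

    poll_task_info is a caller-supplied callable returning a dict such as
    {'state': 'running', 'progress': 0}.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        info = poll_task_info()
        if info['state'] == 'success':
            return info  # the "completed successfully" case in the log
        if info['state'] == 'error':
            raise RuntimeError(info.get('error', 'task failed'))
        # Each pass through here corresponds to a "progress is N%." entry.
        print(f"progress is {info.get('progress', 0)}%.")
        time.sleep(poll_interval)
    raise TimeoutError('task did not complete within %.0fs' % timeout)

# Example: a fake task that succeeds on the second poll.
responses = iter([{'state': 'running', 'progress': 0},
                  {'state': 'success', 'duration_secs': 0.07}])
assert wait_for_task(lambda: next(responses))['state'] == 'success'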
[ 652.343760] env[68571]: DEBUG nova.compute.claims [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 652.343928] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 652.344164] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 652.353531] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 652.406101] env[68571]: DEBUG oslo_vmware.rw_handles [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/70af6db8-2a31-4683-91cf-d97df3e8d8b9/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 652.465318] env[68571]: DEBUG oslo_vmware.rw_handles [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 652.465501] env[68571]: DEBUG oslo_vmware.rw_handles [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/70af6db8-2a31-4683-91cf-d97df3e8d8b9/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 652.805346] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e3c517a-7c44-4139-8d8b-4220ff8e533a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.813920] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-448e9d87-1dd9-4650-a895-74042d2c85a8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.843377] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c665f4f3-3c6d-4b1d-819a-9ef10170cdcd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.850200] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-195e3342-deb5-4d2e-a26c-4fc02b36e619 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 652.862909] env[68571]: DEBUG nova.compute.provider_tree [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 652.871295] env[68571]: DEBUG nova.scheduler.client.report [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 652.888402] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.544s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 652.888959] env[68571]: ERROR nova.compute.manager [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 652.888959] env[68571]: Faults: ['InvalidArgument'] [ 652.888959] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Traceback (most recent call last): [ 652.888959] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 652.888959] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] self.driver.spawn(context, instance, image_meta, [ 652.888959] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 652.888959] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] self._vmops.spawn(context, instance, image_meta, injected_files, [ 652.888959] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 652.888959] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] self._fetch_image_if_missing(context, vi) [ 652.888959] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 652.888959] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] image_cache(vi, tmp_image_ds_loc) [ 652.888959] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 652.889262] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] vm_util.copy_virtual_disk( [ 652.889262] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 652.889262] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] session._wait_for_task(vmdk_copy_task) [ 652.889262] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 652.889262] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] return self.wait_for_task(task_ref) [ 652.889262] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 652.889262] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] return evt.wait() [ 652.889262] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 652.889262] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] result = hub.switch() [ 652.889262] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 652.889262] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] return self.greenlet.switch() [ 652.889262] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 652.889262] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] self.f(*self.args, **self.kw) [ 652.889510] env[68571]: ERROR nova.compute.manager [instance: 
c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 652.889510] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] raise exceptions.translate_fault(task_info.error) [ 652.889510] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 652.889510] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Faults: ['InvalidArgument'] [ 652.889510] env[68571]: ERROR nova.compute.manager [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] [ 652.889871] env[68571]: DEBUG nova.compute.utils [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 652.891022] env[68571]: DEBUG nova.compute.manager [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Build of instance c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471 was re-scheduled: A specified parameter was not correct: fileType [ 652.891022] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 652.891390] env[68571]: DEBUG nova.compute.manager [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 652.891561] env[68571]: DEBUG nova.compute.manager [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 652.891715] env[68571]: DEBUG nova.compute.manager [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 652.891874] env[68571]: DEBUG nova.network.neutron [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 653.420959] env[68571]: DEBUG nova.network.neutron [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 653.430857] env[68571]: INFO nova.compute.manager [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] [instance: c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Took 0.54 seconds to deallocate network for instance.
[ 653.543995] env[68571]: INFO nova.scheduler.client.report [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Deleted allocations for instance c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471
[ 653.571582] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1cbea77b-0d7b-450f-bce5-3275b6333819 tempest-ServersAdminNegativeTestJSON-1834198279 tempest-ServersAdminNegativeTestJSON-1834198279-project-member] Lock "c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 102.059s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
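The failed build above follows a fixed shape: CopyVirtualDisk_Task raises VimFaultException ("A specified parameter was not correct: fileType", fault InvalidArgument), the claim is aborted, the build is re-scheduled, and the instance's allocations are deleted. When combing a log like this one for such failures, a small regex helper can pull out which instances hit which vCenter faults (an ad-hoc parsing sketch tailored to this log's formatting, not part of Nova):

import re

# Matches the "Faults: ['InvalidArgument']" lines in tracebacks like the
# one above.
FAULT_RE = re.compile(
    r"\[instance: (?P<uuid>[0-9a-f-]{36})\] Faults: \['(?P<fault>\w+)'\]")

def find_build_faults(log_text):
    """Return sorted, de-duplicated (instance_uuid, fault_name) pairs."""
    return sorted({(m['uuid'], m['fault']) for m in FAULT_RE.finditer(log_text)})

sample = ("[ 652.889510] env[68571]: ERROR nova.compute.manager [instance: "
          "c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471] Faults: ['InvalidArgument']")
assert find_build_faults(sample) == [
    ('c4d97b3b-ab8b-4d2a-85e7-4d3c46e74471', 'InvalidArgument')]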
[ 653.583213] env[68571]: DEBUG nova.compute.manager [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 653.639838] env[68571]: DEBUG oslo_concurrency.lockutils [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 653.640219] env[68571]: DEBUG oslo_concurrency.lockutils [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 653.642382] env[68571]: INFO nova.compute.claims [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 654.043617] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a290473f-9c0c-43e5-bd3f-55727eae7f9b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 654.051181] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dce126cb-55ba-4dd1-b5ca-560a13e727cf {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 654.080141] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-418463df-d9f9-46f2-aad8-63bf6babb3c3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 654.087241] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55f74ae9-4f18-4557-9279-35281c95cbb7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 654.100093] env[68571]: DEBUG nova.compute.provider_tree [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 654.108178] env[68571]: DEBUG nova.scheduler.client.report [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 654.122478] env[68571]: DEBUG oslo_concurrency.lockutils [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.482s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 654.122951] env[68571]: DEBUG nova.compute.manager [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 654.161149] env[68571]: DEBUG nova.compute.utils [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 654.161708] env[68571]: DEBUG nova.compute.manager [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 654.161895] env[68571]: DEBUG nova.network.neutron [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 654.173771] env[68571]: DEBUG nova.compute.manager [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 654.220940] env[68571]: DEBUG nova.policy [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7fd729f396a44b8eb4adad474a00d4c3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8caef1b37c4c488cae4b1af9942ec8aa', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 654.240241] env[68571]: DEBUG nova.compute.manager [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 654.268122] env[68571]: DEBUG nova.virt.hardware [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 654.268665] env[68571]: DEBUG nova.virt.hardware [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 654.268969] env[68571]: DEBUG nova.virt.hardware [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 654.269302] env[68571]: DEBUG nova.virt.hardware [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 654.269585] env[68571]: DEBUG nova.virt.hardware [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 654.269854] env[68571]: DEBUG nova.virt.hardware [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 654.270205] env[68571]: DEBUG nova.virt.hardware [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 654.270504] env[68571]: DEBUG nova.virt.hardware [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 654.270809] env[68571]: DEBUG nova.virt.hardware [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 654.272907] env[68571]: DEBUG nova.virt.hardware [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 654.272907] env[68571]: DEBUG nova.virt.hardware [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
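For this 1-vCPU m1.nano flavor with no flavor or image limits, the topology search above degenerates to the single candidate VirtCPUTopology(cores=1,sockets=1,threads=1). The core constraint is that sockets x cores x threads must equal the vCPU count while respecting the 65536 per-dimension caps; a simplified model of that enumeration (illustrative only, not nova.virt.hardware itself, which also applies preferences and sort order):

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    """Enumerate (sockets, cores, threads) triples with product == vcpus."""
    found = []
    for sockets in range(1, min(vcpus, max_sockets) + 1):
        if vcpus % sockets:
            continue
        per_socket = vcpus // sockets
        for cores in range(1, min(per_socket, max_cores) + 1):
            if per_socket % cores:
                continue
            threads = per_socket // cores
            if threads <= max_threads:
                found.append((sockets, cores, threads))
    return found

# "Build topologies for 1 vcpu(s) 1:1:1" -> "Got 1 possible topologies".
assert possible_topologies(1) == [(1, 1, 1)]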
[ 654.272907] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e9a2a25c-eb59-4515-a843-0ad449452ed2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 654.281972] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd076345-7292-4f4a-a6a2-7a81541fefaa {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 654.520118] env[68571]: DEBUG nova.network.neutron [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Successfully created port: 6a6c25c8-247f-4b9c-95e9-4245cfdf8528 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 655.098978] env[68571]: DEBUG nova.network.neutron [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Successfully updated port: 6a6c25c8-247f-4b9c-95e9-4245cfdf8528 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 655.112137] env[68571]: DEBUG oslo_concurrency.lockutils [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Acquiring lock "refresh_cache-4a43ba00-1df6-4f10-a4ce-37c4ae353cc2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 655.112279] env[68571]: DEBUG oslo_concurrency.lockutils [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Acquired lock "refresh_cache-4a43ba00-1df6-4f10-a4ce-37c4ae353cc2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 655.112432] env[68571]: DEBUG nova.network.neutron [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance:
4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 655.151540] env[68571]: DEBUG nova.network.neutron [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 655.324326] env[68571]: DEBUG nova.network.neutron [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Updating instance_info_cache with network_info: [{"id": "6a6c25c8-247f-4b9c-95e9-4245cfdf8528", "address": "fa:16:3e:3d:d2:4c", "network": {"id": "676d3b06-8b9f-486e-9255-b89fdd191a80", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1675932776-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8caef1b37c4c488cae4b1af9942ec8aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8cb478a6-872c-4a90-a8db-526b374e82ce", "external-id": "nsx-vlan-transportzone-835", "segmentation_id": 835, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6a6c25c8-24", "ovs_interfaceid": "6a6c25c8-247f-4b9c-95e9-4245cfdf8528", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 655.339571] env[68571]: DEBUG oslo_concurrency.lockutils [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Releasing lock "refresh_cache-4a43ba00-1df6-4f10-a4ce-37c4ae353cc2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 655.342026] env[68571]: DEBUG nova.compute.manager [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Instance network_info: |[{"id": "6a6c25c8-247f-4b9c-95e9-4245cfdf8528", "address": "fa:16:3e:3d:d2:4c", "network": {"id": "676d3b06-8b9f-486e-9255-b89fdd191a80", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1675932776-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8caef1b37c4c488cae4b1af9942ec8aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": 
"ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8cb478a6-872c-4a90-a8db-526b374e82ce", "external-id": "nsx-vlan-transportzone-835", "segmentation_id": 835, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6a6c25c8-24", "ovs_interfaceid": "6a6c25c8-247f-4b9c-95e9-4245cfdf8528", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 655.342155] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:3d:d2:4c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8cb478a6-872c-4a90-a8db-526b374e82ce', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6a6c25c8-247f-4b9c-95e9-4245cfdf8528', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 655.347605] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Creating folder: Project (8caef1b37c4c488cae4b1af9942ec8aa). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 655.348402] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2c767f6e-06d2-4cdb-92e1-e72eda140f56 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.359742] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Created folder: Project (8caef1b37c4c488cae4b1af9942ec8aa) in parent group-v692787. [ 655.359932] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Creating folder: Instances. Parent ref: group-v692823. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 655.360200] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-85917589-add4-419b-8f57-9ae7ad539aaa {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.368807] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Created folder: Instances in parent group-v692823. [ 655.369099] env[68571]: DEBUG oslo.service.loopingcall [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 655.369291] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 655.369501] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-db29700b-3933-49a1-89a3-3b689ffce328 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.390860] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 655.390860] env[68571]: value = "task-3467596" [ 655.390860] env[68571]: _type = "Task" [ 655.390860] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 655.403505] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467596, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 655.699357] env[68571]: DEBUG nova.compute.manager [req-b39cc3be-c223-445a-bbb7-501228584364 req-1867046b-d7ec-4c34-a888-a04529c25c68 service nova] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Received event network-vif-plugged-6a6c25c8-247f-4b9c-95e9-4245cfdf8528 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 655.699585] env[68571]: DEBUG oslo_concurrency.lockutils [req-b39cc3be-c223-445a-bbb7-501228584364 req-1867046b-d7ec-4c34-a888-a04529c25c68 service nova] Acquiring lock "4a43ba00-1df6-4f10-a4ce-37c4ae353cc2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 655.699828] env[68571]: DEBUG oslo_concurrency.lockutils [req-b39cc3be-c223-445a-bbb7-501228584364 req-1867046b-d7ec-4c34-a888-a04529c25c68 service nova] Lock "4a43ba00-1df6-4f10-a4ce-37c4ae353cc2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 655.699916] env[68571]: DEBUG oslo_concurrency.lockutils [req-b39cc3be-c223-445a-bbb7-501228584364 req-1867046b-d7ec-4c34-a888-a04529c25c68 service nova] Lock "4a43ba00-1df6-4f10-a4ce-37c4ae353cc2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 655.700350] env[68571]: DEBUG nova.compute.manager [req-b39cc3be-c223-445a-bbb7-501228584364 req-1867046b-d7ec-4c34-a888-a04529c25c68 service nova] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] No waiting events found dispatching network-vif-plugged-6a6c25c8-247f-4b9c-95e9-4245cfdf8528 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 655.700564] env[68571]: WARNING nova.compute.manager [req-b39cc3be-c223-445a-bbb7-501228584364 req-1867046b-d7ec-4c34-a888-a04529c25c68 service nova] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Received unexpected event network-vif-plugged-6a6c25c8-247f-4b9c-95e9-4245cfdf8528 for instance with vm_state building and task_state spawning. 
[ 655.700735] env[68571]: DEBUG nova.compute.manager [req-b39cc3be-c223-445a-bbb7-501228584364 req-1867046b-d7ec-4c34-a888-a04529c25c68 service nova] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Received event network-changed-6a6c25c8-247f-4b9c-95e9-4245cfdf8528 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 655.700919] env[68571]: DEBUG nova.compute.manager [req-b39cc3be-c223-445a-bbb7-501228584364 req-1867046b-d7ec-4c34-a888-a04529c25c68 service nova] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Refreshing instance network info cache due to event network-changed-6a6c25c8-247f-4b9c-95e9-4245cfdf8528. {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 655.701128] env[68571]: DEBUG oslo_concurrency.lockutils [req-b39cc3be-c223-445a-bbb7-501228584364 req-1867046b-d7ec-4c34-a888-a04529c25c68 service nova] Acquiring lock "refresh_cache-4a43ba00-1df6-4f10-a4ce-37c4ae353cc2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 655.701266] env[68571]: DEBUG oslo_concurrency.lockutils [req-b39cc3be-c223-445a-bbb7-501228584364 req-1867046b-d7ec-4c34-a888-a04529c25c68 service nova] Acquired lock "refresh_cache-4a43ba00-1df6-4f10-a4ce-37c4ae353cc2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 655.701423] env[68571]: DEBUG nova.network.neutron [req-b39cc3be-c223-445a-bbb7-501228584364 req-1867046b-d7ec-4c34-a888-a04529c25c68 service nova] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Refreshing network info cache for port 6a6c25c8-247f-4b9c-95e9-4245cfdf8528 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 655.901022] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467596, 'name': CreateVM_Task, 'duration_secs': 0.284215} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 655.901108] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 655.901721] env[68571]: DEBUG oslo_concurrency.lockutils [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 655.901886] env[68571]: DEBUG oslo_concurrency.lockutils [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 655.902209] env[68571]: DEBUG oslo_concurrency.lockutils [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 655.902452] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-92832dba-5d7f-4ee1-aae7-3c4705931348 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 655.906840] env[68571]: DEBUG oslo_vmware.api [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Waiting for the task: (returnval){ [ 655.906840] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]524c1428-c2b7-a3ec-ec27-127696b08acb" [ 655.906840] env[68571]: _type = "Task" [ 655.906840] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 655.914566] env[68571]: DEBUG oslo_vmware.api [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]524c1428-c2b7-a3ec-ec27-127696b08acb, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 655.964596] env[68571]: DEBUG nova.network.neutron [req-b39cc3be-c223-445a-bbb7-501228584364 req-1867046b-d7ec-4c34-a888-a04529c25c68 service nova] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Updated VIF entry in instance network info cache for port 6a6c25c8-247f-4b9c-95e9-4245cfdf8528. 
{{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 655.964990] env[68571]: DEBUG nova.network.neutron [req-b39cc3be-c223-445a-bbb7-501228584364 req-1867046b-d7ec-4c34-a888-a04529c25c68 service nova] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Updating instance_info_cache with network_info: [{"id": "6a6c25c8-247f-4b9c-95e9-4245cfdf8528", "address": "fa:16:3e:3d:d2:4c", "network": {"id": "676d3b06-8b9f-486e-9255-b89fdd191a80", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-1675932776-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "8caef1b37c4c488cae4b1af9942ec8aa", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8cb478a6-872c-4a90-a8db-526b374e82ce", "external-id": "nsx-vlan-transportzone-835", "segmentation_id": 835, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6a6c25c8-24", "ovs_interfaceid": "6a6c25c8-247f-4b9c-95e9-4245cfdf8528", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 655.975499] env[68571]: DEBUG oslo_concurrency.lockutils [req-b39cc3be-c223-445a-bbb7-501228584364 req-1867046b-d7ec-4c34-a888-a04529c25c68 service nova] Releasing lock "refresh_cache-4a43ba00-1df6-4f10-a4ce-37c4ae353cc2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 656.418074] env[68571]: DEBUG oslo_concurrency.lockutils [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 656.418074] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 656.418184] env[68571]: DEBUG oslo_concurrency.lockutils [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 662.103061] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Acquiring lock "244ba708-279e-440e-bc18-8c6ee7b83250" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 662.103362] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Lock "244ba708-279e-440e-bc18-8c6ee7b83250" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 689.489174] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 691.490024] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 691.490024] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 691.490024] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 691.506541] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 691.506761] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 691.506974] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 691.507217] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 691.508781] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0141e9cf-a843-42b2-881c-4f25b363efe4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 691.517803] env[68571]: DEBUG oslo_vmware.service [-] 
Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5128efb0-aebf-49fd-b5a9-cf77b2c45383 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 691.532255] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bec04b9a-a12b-4e38-8620-8454851e1ba1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 691.538566] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9ea8ca8-f906-402c-909a-16dbe938a721 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 691.566969] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180932MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 691.567142] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 691.567338] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 691.644776] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 0eae5e9a-258a-44e5-9b4f-53100f15aa7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 691.644776] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f3b237f4-6e23-4474-b841-aa3ca8c1486f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 691.644776] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance e49a885d-c0d2-414b-b1f0-bfc3a710e9ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 691.644776] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 691.644954] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 8c30562a-4a81-4007-923c-3bc0b922f01c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 691.644954] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 15eb6744-4b26-4d7a-8639-cb3bd13e3726 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 691.644954] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 691.644954] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 691.645072] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 691.645072] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 691.658613] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance ccd1b692-f511-43c8-8b3d-ce92ef27670f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 691.668984] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3adaf481-5844-45ac-8dc9-eb396a47ed1c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 691.679195] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance c962c9c7-04a4-46ec-a46f-fac13caa6a1e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 691.687835] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 0be1ddd3-e07f-49b3-a5a7-df32b5262c30 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 691.696751] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 6894c90c-cbfb-4226-a0b5-e195f923c8e0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 691.707956] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 7ff5aa4a-0f8a-4ed8-a918-ef2fe3410455 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 691.717051] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 52ef7a21-6254-4ac1-a3c0-93f1ac70dd9c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 691.730727] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance db77f64d-5b6c-4a88-aa1c-2622832b3f58 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 691.741091] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 7a000e36-e100-4c79-a170-8cf86a4244d7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 691.750580] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 47ab9428-5860-4c42-a5ec-a9ff608790e9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 691.760875] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance a91a0cd6-a014-43c7-8723-55825c0c8662 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 691.771164] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance a6eac04c-996e-4733-a37e-d1ba61762409 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 691.780824] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance ef53dbb8-20d3-4b5c-be29-ce75cc6c0233 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 691.790589] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 25f17a16-f752-4927-a2a5-73f1f18e5c8c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 691.801014] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 0d78609e-cda0-4309-af6e-7d30a939443b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 691.811011] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 78ce800c-1f8e-496e-9be2-24675657acb2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 691.820852] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 1ed21e6d-6b5a-4e6e-9466-b5beceda09e1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 691.834179] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance ee0a3514-6892-4ee8-bad7-9b2867ba439e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 691.843962] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance a6628de8-b7e9-466c-8cde-3f4f322c0faf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 691.853147] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 7a350ebc-61e6-4e4d-99bc-adb67b518395 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 691.862523] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 95bc8fb9-032a-41d7-b925-dc9b60d21735 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 691.871300] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 244ba708-279e-440e-bc18-8c6ee7b83250 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 691.871534] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 691.871681] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 692.220575] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-951f845b-beab-46c5-8eb9-bd2d84bd6344 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 692.227925] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c728ba66-526c-429c-8575-b684d84bc6eb {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 692.258260] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-38ab9880-1c65-4e54-9e81-5c5f54e125ae {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 692.265413] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b81c9cb-155d-4b48-9232-2faf29f7cfdd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 692.278247] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 692.286363] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 692.302494] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 692.302682] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.735s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 693.297600] env[68571]: DEBUG oslo_service.periodic_task [None 
req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 693.297874] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 693.298117] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 693.488794] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 693.488947] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 693.489148] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 693.508576] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 693.508739] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 693.508890] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 693.509058] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 693.509240] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 693.509375] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Skipping network cache update for instance because it is Building. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 693.509501] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 693.509625] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 693.509748] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 693.509870] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 693.509995] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 693.510506] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 693.510649] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 702.013046] env[68571]: WARNING oslo_vmware.rw_handles [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 702.013046] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 702.013046] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 702.013046] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 702.013046] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 702.013046] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 702.013046] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 702.013046] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 702.013046] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 702.013046] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 702.013046] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 702.013046] env[68571]: ERROR oslo_vmware.rw_handles [ 702.013695] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/70af6db8-2a31-4683-91cf-d97df3e8d8b9/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 702.015653] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 702.016065] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Copying Virtual Disk [datastore1] vmware_temp/70af6db8-2a31-4683-91cf-d97df3e8d8b9/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/70af6db8-2a31-4683-91cf-d97df3e8d8b9/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 702.016413] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-c6b94c53-8808-4684-bcdc-c12d466f3eba {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 702.025844] env[68571]: DEBUG oslo_vmware.api [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Waiting for the task: (returnval){ [ 702.025844] env[68571]: 
value = "task-3467597" [ 702.025844] env[68571]: _type = "Task" [ 702.025844] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 702.037407] env[68571]: DEBUG oslo_vmware.api [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Task: {'id': task-3467597, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 702.536329] env[68571]: DEBUG oslo_vmware.exceptions [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Fault InvalidArgument not matched. {{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 702.536604] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 702.537150] env[68571]: ERROR nova.compute.manager [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 702.537150] env[68571]: Faults: ['InvalidArgument'] [ 702.537150] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Traceback (most recent call last): [ 702.537150] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 702.537150] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] yield resources [ 702.537150] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 702.537150] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] self.driver.spawn(context, instance, image_meta, [ 702.537150] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 702.537150] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 702.537150] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 702.537150] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] self._fetch_image_if_missing(context, vi) [ 702.537150] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 702.537447] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] image_cache(vi, tmp_image_ds_loc) [ 702.537447] env[68571]: 
ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 702.537447] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] vm_util.copy_virtual_disk( [ 702.537447] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 702.537447] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] session._wait_for_task(vmdk_copy_task) [ 702.537447] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 702.537447] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] return self.wait_for_task(task_ref) [ 702.537447] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 702.537447] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] return evt.wait() [ 702.537447] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 702.537447] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] result = hub.switch() [ 702.537447] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 702.537447] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] return self.greenlet.switch() [ 702.537791] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 702.537791] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] self.f(*self.args, **self.kw) [ 702.537791] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 702.537791] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] raise exceptions.translate_fault(task_info.error) [ 702.537791] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 702.537791] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Faults: ['InvalidArgument'] [ 702.537791] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] [ 702.537791] env[68571]: INFO nova.compute.manager [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Terminating instance [ 702.539021] env[68571]: DEBUG oslo_concurrency.lockutils [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 702.539238] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 702.539490] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a66aeb15-0c70-438b-86c4-e556acc0736e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 702.541508] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Acquiring lock "refresh_cache-8c30562a-4a81-4007-923c-3bc0b922f01c" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 702.541668] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Acquired lock "refresh_cache-8c30562a-4a81-4007-923c-3bc0b922f01c" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 702.541842] env[68571]: DEBUG nova.network.neutron [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 702.548587] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 702.548762] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 702.549538] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c10ff317-7a85-4179-b074-96b28214a0d9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 702.557094] env[68571]: DEBUG oslo_vmware.api [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Waiting for the task: (returnval){ [ 702.557094] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52665a11-e498-ed48-1bd4-75b18da313d8" [ 702.557094] env[68571]: _type = "Task" [ 702.557094] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 702.564506] env[68571]: DEBUG oslo_vmware.api [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52665a11-e498-ed48-1bd4-75b18da313d8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 702.576965] env[68571]: DEBUG nova.network.neutron [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 702.639014] env[68571]: DEBUG nova.network.neutron [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 702.647894] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Releasing lock "refresh_cache-8c30562a-4a81-4007-923c-3bc0b922f01c" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 702.648411] env[68571]: DEBUG nova.compute.manager [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Start destroying the instance on the hypervisor. 
{{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 702.648645] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 702.649885] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-217b4c98-ca37-416c-92bd-fe87aa444df1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 702.658358] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 702.658625] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4c65e1c4-9f42-4d25-8ab4-9022e09c67d0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 702.687452] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 702.687697] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 702.687785] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Deleting the datastore file [datastore1] 8c30562a-4a81-4007-923c-3bc0b922f01c {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 702.688045] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b6dab3c9-4716-46d5-8693-76c60ddf07f2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 702.694020] env[68571]: DEBUG oslo_vmware.api [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Waiting for the task: (returnval){ [ 702.694020] env[68571]: value = "task-3467599" [ 702.694020] env[68571]: _type = "Task" [ 702.694020] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 702.701394] env[68571]: DEBUG oslo_vmware.api [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Task: {'id': task-3467599, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 703.067794] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 703.068079] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Creating directory with path [datastore1] vmware_temp/5fe6d403-342e-4105-a38c-47d0094f535a/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 703.068329] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-90eac66c-a0dd-4f9e-bb29-95fc1e0a5e04 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.079583] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Created directory with path [datastore1] vmware_temp/5fe6d403-342e-4105-a38c-47d0094f535a/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 703.079769] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Fetch image to [datastore1] vmware_temp/5fe6d403-342e-4105-a38c-47d0094f535a/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 703.079933] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/5fe6d403-342e-4105-a38c-47d0094f535a/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 703.080647] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bfdf949-2482-4f73-b6f1-3387e148f0cf {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.086976] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f521a8fb-6aad-4240-887c-8448f8885c03 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.095620] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78104cd8-7e37-4850-9120-617224583e8e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.125200] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5813c1e7-ba7c-47ac-b3ec-29560ebdfa50 {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.130730] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-cd0674ed-8c17-4fc4-934b-f2deeec407cd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.152143] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 703.204125] env[68571]: DEBUG oslo_vmware.api [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Task: {'id': task-3467599, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.043174} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 703.204384] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 703.204571] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 703.204741] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 703.204910] env[68571]: INFO nova.compute.manager [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Took 0.56 seconds to destroy the instance on the hypervisor. [ 703.205180] env[68571]: DEBUG oslo.service.loopingcall [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 703.205383] env[68571]: DEBUG nova.compute.manager [-] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Skipping network deallocation for instance since networking was not requested. 
{{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 703.207771] env[68571]: DEBUG oslo_vmware.rw_handles [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5fe6d403-342e-4105-a38c-47d0094f535a/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 703.209447] env[68571]: DEBUG nova.compute.claims [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 703.209613] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 703.209819] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 703.272862] env[68571]: DEBUG oslo_vmware.rw_handles [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 703.272862] env[68571]: DEBUG oslo_vmware.rw_handles [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/5fe6d403-342e-4105-a38c-47d0094f535a/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 703.634350] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06b18b38-e69e-4e6b-b650-f5bba6a74bcc {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.641871] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3189b515-b0dc-413c-8c9c-78f6380c6b2c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.671502] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17b42bf4-ab58-41ea-9f14-6e03bb4624ef {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.678409] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4e48bc0-6e36-445a-8bac-881cd139a05d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 703.691057] env[68571]: DEBUG nova.compute.provider_tree [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 703.699369] env[68571]: DEBUG nova.scheduler.client.report [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 703.713034] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.502s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 703.713034] env[68571]: ERROR nova.compute.manager [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 703.713034] env[68571]: Faults: ['InvalidArgument'] [ 703.713034] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Traceback (most recent call last): [ 703.713034] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 703.713034] env[68571]: ERROR nova.compute.manager [instance: 
8c30562a-4a81-4007-923c-3bc0b922f01c] self.driver.spawn(context, instance, image_meta, [ 703.713034] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 703.713034] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 703.713034] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 703.713034] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] self._fetch_image_if_missing(context, vi) [ 703.713350] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 703.713350] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] image_cache(vi, tmp_image_ds_loc) [ 703.713350] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 703.713350] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] vm_util.copy_virtual_disk( [ 703.713350] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 703.713350] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] session._wait_for_task(vmdk_copy_task) [ 703.713350] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 703.713350] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] return self.wait_for_task(task_ref) [ 703.713350] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 703.713350] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] return evt.wait() [ 703.713350] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 703.713350] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] result = hub.switch() [ 703.713350] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 703.713633] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] return self.greenlet.switch() [ 703.713633] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 703.713633] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] self.f(*self.args, **self.kw) [ 703.713633] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 
448, in _poll_task [ 703.713633] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] raise exceptions.translate_fault(task_info.error) [ 703.713633] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 703.713633] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Faults: ['InvalidArgument'] [ 703.713633] env[68571]: ERROR nova.compute.manager [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] [ 703.713633] env[68571]: DEBUG nova.compute.utils [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 703.714792] env[68571]: DEBUG nova.compute.manager [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Build of instance 8c30562a-4a81-4007-923c-3bc0b922f01c was re-scheduled: A specified parameter was not correct: fileType [ 703.714792] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 703.715215] env[68571]: DEBUG nova.compute.manager [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 703.715387] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Acquiring lock "refresh_cache-8c30562a-4a81-4007-923c-3bc0b922f01c" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 703.715534] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Acquired lock "refresh_cache-8c30562a-4a81-4007-923c-3bc0b922f01c" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 703.715694] env[68571]: DEBUG nova.network.neutron [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 703.739340] env[68571]: DEBUG nova.network.neutron [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Instance cache missing network info. 
{{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 703.801638] env[68571]: DEBUG nova.network.neutron [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 703.811375] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Releasing lock "refresh_cache-8c30562a-4a81-4007-923c-3bc0b922f01c" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 703.811651] env[68571]: DEBUG nova.compute.manager [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 703.811847] env[68571]: DEBUG nova.compute.manager [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] [instance: 8c30562a-4a81-4007-923c-3bc0b922f01c] Skipping network deallocation for instance since networking was not requested. {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 703.899396] env[68571]: INFO nova.scheduler.client.report [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Deleted allocations for instance 8c30562a-4a81-4007-923c-3bc0b922f01c [ 703.925739] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c868e849-a073-4b29-843d-ff042737ad65 tempest-ServersAdmin275Test-1844452862 tempest-ServersAdmin275Test-1844452862-project-member] Lock "8c30562a-4a81-4007-923c-3bc0b922f01c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 146.563s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 703.949720] env[68571]: DEBUG nova.compute.manager [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Starting instance... 
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 704.002448] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 704.002760] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 704.004274] env[68571]: INFO nova.compute.claims [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 704.396127] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-352fc4d9-2ce2-4887-8744-db48945fdd29 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 704.404655] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-092c18c2-05f2-44ab-b362-97de66399535 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 704.434966] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a028d588-4a8a-4de1-8e68-728ebec15017 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 704.442442] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-264f1eca-9ee5-45b5-ad84-46104bb11154 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 704.456622] env[68571]: DEBUG nova.compute.provider_tree [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 704.466726] env[68571]: DEBUG nova.scheduler.client.report [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 704.481374] env[68571]: DEBUG 
oslo_concurrency.lockutils [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.479s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 704.481735] env[68571]: DEBUG nova.compute.manager [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 704.519875] env[68571]: DEBUG nova.compute.utils [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 704.521210] env[68571]: DEBUG nova.compute.manager [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 704.521418] env[68571]: DEBUG nova.network.neutron [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 704.530178] env[68571]: DEBUG nova.compute.manager [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Start building block device mappings for instance. 
{{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 704.562162] env[68571]: INFO nova.virt.block_device [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Booting with volume 7451f85f-ba63-4b63-8e08-4ec8061213ed at /dev/sda [ 704.584388] env[68571]: DEBUG nova.policy [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'dcb37e18babe42f7b84ba856aeef05e6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b8e068f9c390406892fe822978985780', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 704.617316] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-162c5055-a825-4cb6-aada-be4e3abc199d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 704.628201] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54815ef0-373c-4cf0-8601-7e9d4de6e186 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 704.658030] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c9f195a2-c789-49aa-99db-7f25954cbc9c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 704.665706] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1a76bf69-97d6-415a-8323-95a370adadf9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 704.693792] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1020e79-2087-4d9d-bf8e-b34d6293552f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 704.700448] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1ce8720-c4d7-423b-b5e2-c1583e20a028 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 704.713965] env[68571]: DEBUG nova.virt.block_device [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Updating existing volume attachment record: 98e34e71-7a43-40d6-85af-36b190ee2a0b {{(pid=68571) _volume_attach /opt/stack/nova/nova/virt/block_device.py:631}} [ 704.956735] env[68571]: DEBUG nova.network.neutron [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Successfully created port: e15fdec4-63a8-4a6d-8c72-d439ed56c710 {{(pid=68571) _create_port_minimal 
/opt/stack/nova/nova/network/neutron.py:548}} [ 704.994246] env[68571]: DEBUG nova.compute.manager [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Start spawning the instance on the hypervisor. {{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 704.994882] env[68571]: DEBUG nova.virt.hardware [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=,container_format=,created_at=,direct_url=,disk_format=,id=,min_disk=0,min_ram=0,name=,owner=,properties=ImageMetaProps,protected=,size=1073741824,status='active',tags=,updated_at=,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 704.995246] env[68571]: DEBUG nova.virt.hardware [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 704.995969] env[68571]: DEBUG nova.virt.hardware [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 704.995969] env[68571]: DEBUG nova.virt.hardware [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 704.996487] env[68571]: DEBUG nova.virt.hardware [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 704.996487] env[68571]: DEBUG nova.virt.hardware [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 704.996679] env[68571]: DEBUG nova.virt.hardware [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 704.996943] env[68571]: DEBUG nova.virt.hardware [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 
tempest-ServerActionsV293TestJSON-1571628166-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 704.997438] env[68571]: DEBUG nova.virt.hardware [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 704.997555] env[68571]: DEBUG nova.virt.hardware [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 704.997948] env[68571]: DEBUG nova.virt.hardware [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 704.999960] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ec102f2-c6d0-423b-a712-b3162adb61ca {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 705.014724] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9266dac2-ba73-4adf-9364-be5abaf3fd12 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 705.611376] env[68571]: DEBUG nova.network.neutron [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Successfully updated port: e15fdec4-63a8-4a6d-8c72-d439ed56c710 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 705.625487] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Acquiring lock "refresh_cache-ccd1b692-f511-43c8-8b3d-ce92ef27670f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 705.625642] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Acquired lock "refresh_cache-ccd1b692-f511-43c8-8b3d-ce92ef27670f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 705.625796] env[68571]: DEBUG nova.network.neutron [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 705.671533] env[68571]: DEBUG nova.network.neutron [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: 
ccd1b692-f511-43c8-8b3d-ce92ef27670f] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 705.719029] env[68571]: DEBUG nova.compute.manager [req-35669316-450f-4cec-9e5e-db4f7b3e4ca3 req-92a483a0-228b-4bb3-b998-c6139c8826c1 service nova] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Received event network-vif-plugged-e15fdec4-63a8-4a6d-8c72-d439ed56c710 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 705.719314] env[68571]: DEBUG oslo_concurrency.lockutils [req-35669316-450f-4cec-9e5e-db4f7b3e4ca3 req-92a483a0-228b-4bb3-b998-c6139c8826c1 service nova] Acquiring lock "ccd1b692-f511-43c8-8b3d-ce92ef27670f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 705.719498] env[68571]: DEBUG oslo_concurrency.lockutils [req-35669316-450f-4cec-9e5e-db4f7b3e4ca3 req-92a483a0-228b-4bb3-b998-c6139c8826c1 service nova] Lock "ccd1b692-f511-43c8-8b3d-ce92ef27670f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 705.719789] env[68571]: DEBUG oslo_concurrency.lockutils [req-35669316-450f-4cec-9e5e-db4f7b3e4ca3 req-92a483a0-228b-4bb3-b998-c6139c8826c1 service nova] Lock "ccd1b692-f511-43c8-8b3d-ce92ef27670f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 705.719973] env[68571]: DEBUG nova.compute.manager [req-35669316-450f-4cec-9e5e-db4f7b3e4ca3 req-92a483a0-228b-4bb3-b998-c6139c8826c1 service nova] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] No waiting events found dispatching network-vif-plugged-e15fdec4-63a8-4a6d-8c72-d439ed56c710 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 705.720154] env[68571]: WARNING nova.compute.manager [req-35669316-450f-4cec-9e5e-db4f7b3e4ca3 req-92a483a0-228b-4bb3-b998-c6139c8826c1 service nova] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Received unexpected event network-vif-plugged-e15fdec4-63a8-4a6d-8c72-d439ed56c710 for instance with vm_state building and task_state spawning. 
[ 706.074936] env[68571]: DEBUG nova.network.neutron [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Updating instance_info_cache with network_info: [{"id": "e15fdec4-63a8-4a6d-8c72-d439ed56c710", "address": "fa:16:3e:07:3b:76", "network": {"id": "6c249714-cde3-43c8-9f93-1cc6dba49eef", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-575203170-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b8e068f9c390406892fe822978985780", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "572b7281-aad3-45fa-9cb2-fc1c70569948", "external-id": "nsx-vlan-transportzone-722", "segmentation_id": 722, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape15fdec4-63", "ovs_interfaceid": "e15fdec4-63a8-4a6d-8c72-d439ed56c710", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 706.085700] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Releasing lock "refresh_cache-ccd1b692-f511-43c8-8b3d-ce92ef27670f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 706.086053] env[68571]: DEBUG nova.compute.manager [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Instance network_info: |[{"id": "e15fdec4-63a8-4a6d-8c72-d439ed56c710", "address": "fa:16:3e:07:3b:76", "network": {"id": "6c249714-cde3-43c8-9f93-1cc6dba49eef", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-575203170-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b8e068f9c390406892fe822978985780", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "572b7281-aad3-45fa-9cb2-fc1c70569948", "external-id": "nsx-vlan-transportzone-722", "segmentation_id": 722, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape15fdec4-63", "ovs_interfaceid": "e15fdec4-63a8-4a6d-8c72-d439ed56c710", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 706.086417] env[68571]: DEBUG 
nova.virt.vmwareapi.vmops [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:07:3b:76', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '572b7281-aad3-45fa-9cb2-fc1c70569948', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'e15fdec4-63a8-4a6d-8c72-d439ed56c710', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 706.093923] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Creating folder: Project (b8e068f9c390406892fe822978985780). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 706.094481] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6e5d33cf-ca75-4d7a-8292-5ecb3a55cb29 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 706.107670] env[68571]: WARNING suds.client [-] Web service reported a SOAP processing fault using an unexpected HTTP status code 200. Reporting as an internal server error. [ 706.107826] env[68571]: DEBUG oslo_vmware.api [-] Fault list: [DuplicateName] {{(pid=68571) _invoke_api /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:337}} [ 706.108200] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Folder already exists: Project (b8e068f9c390406892fe822978985780). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1599}} [ 706.108389] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Creating folder: Instances. Parent ref: group-v692788. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 706.108616] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-825f44f8-e774-435d-9602-fc29fad52144 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 706.119284] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Created folder: Instances in parent group-v692788. [ 706.119500] env[68571]: DEBUG oslo.service.loopingcall [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 706.119676] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 706.119864] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-f462136b-ad3f-44d2-9dd4-7e6f54898e5c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 706.138895] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 706.138895] env[68571]: value = "task-3467602" [ 706.138895] env[68571]: _type = "Task" [ 706.138895] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 706.149599] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467602, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 706.648922] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467602, 'name': CreateVM_Task, 'duration_secs': 0.29665} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 706.649327] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 706.649963] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Block device information present: {'root_device_name': '/dev/sda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'mount_device': '/dev/sda', 'guest_format': None, 'device_type': None, 'delete_on_termination': True, 'disk_bus': None, 'attachment_id': '98e34e71-7a43-40d6-85af-36b190ee2a0b', 'boot_index': 0, 'connection_info': {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-692809', 'volume_id': '7451f85f-ba63-4b63-8e08-4ec8061213ed', 'name': 'volume-7451f85f-ba63-4b63-8e08-4ec8061213ed', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'ccd1b692-f511-43c8-8b3d-ce92ef27670f', 'attached_at': '', 'detached_at': '', 'volume_id': '7451f85f-ba63-4b63-8e08-4ec8061213ed', 'serial': '7451f85f-ba63-4b63-8e08-4ec8061213ed'}, 'volume_type': None}], 'swap': None} {{(pid=68571) spawn /opt/stack/nova/nova/virt/vmwareapi/vmops.py:799}} [ 706.650202] env[68571]: DEBUG nova.virt.vmwareapi.volumeops [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Root volume attach. 
[ 706.651059] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-386b22a1-8e21-4594-ba6c-77bc47e865c3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 706.658682] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1ca422b-46fd-43b7-8a0b-659edd3663a5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 706.664915] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c685eac-8d55-4281-9141-23d93e31b26e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 706.672642] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.RelocateVM_Task with opID=oslo.vmware-81f325f5-4d86-4e3c-baeb-1d7eae2b9bc1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 706.678995] env[68571]: DEBUG oslo_vmware.api [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Waiting for the task: (returnval){ [ 706.678995] env[68571]: value = "task-3467603" [ 706.678995] env[68571]: _type = "Task" [ 706.678995] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 706.687545] env[68571]: DEBUG oslo_vmware.api [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Task: {'id': task-3467603, 'name': RelocateVM_Task} progress is 5%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 707.188729] env[68571]: DEBUG oslo_vmware.api [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Task: {'id': task-3467603, 'name': RelocateVM_Task, 'duration_secs': 0.360289} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 707.189123] env[68571]: DEBUG nova.virt.vmwareapi.volumeops [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Volume attach.
Driver type: vmdk {{(pid=68571) attach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:439}} [ 707.189429] env[68571]: DEBUG nova.virt.vmwareapi.volumeops [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] _attach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-692809', 'volume_id': '7451f85f-ba63-4b63-8e08-4ec8061213ed', 'name': 'volume-7451f85f-ba63-4b63-8e08-4ec8061213ed', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'ccd1b692-f511-43c8-8b3d-ce92ef27670f', 'attached_at': '', 'detached_at': '', 'volume_id': '7451f85f-ba63-4b63-8e08-4ec8061213ed', 'serial': '7451f85f-ba63-4b63-8e08-4ec8061213ed'} {{(pid=68571) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:336}} [ 707.190217] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-823b7e7a-3f39-420d-b97c-a2a7923192cc {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.207327] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3db71c75-9409-4483-9fb2-80a2ab4611fa {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.231694] env[68571]: DEBUG nova.virt.vmwareapi.volumeops [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Reconfiguring VM instance instance-0000000d to attach disk [datastore1] volume-7451f85f-ba63-4b63-8e08-4ec8061213ed/volume-7451f85f-ba63-4b63-8e08-4ec8061213ed.vmdk or device None with type thin {{(pid=68571) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:81}} [ 707.231994] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-b7ba2193-c61c-4908-808a-25c87ac44edf {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.251286] env[68571]: DEBUG oslo_vmware.api [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Waiting for the task: (returnval){ [ 707.251286] env[68571]: value = "task-3467604" [ 707.251286] env[68571]: _type = "Task" [ 707.251286] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 707.259394] env[68571]: DEBUG oslo_vmware.api [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Task: {'id': task-3467604, 'name': ReconfigVM_Task} progress is 5%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 707.760503] env[68571]: DEBUG oslo_vmware.api [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Task: {'id': task-3467604, 'name': ReconfigVM_Task, 'duration_secs': 0.247438} completed successfully. 
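{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}

The ReconfigVM_Task above carries a VirtualMachineConfigSpec whose deviceChange adds a VirtualDisk backed by the volume's .vmdk, thin-provisioned as the "with type thin" text indicates. A condensed sketch of how such a spec can be assembled with the suds client factory; controller selection and error handling are omitted and the literal values are illustrative:

```python
def attach_disk_spec(client_factory, controller_key, unit_number, vmdk_path):
    """Build a ReconfigVM_Task spec that adds one thin-provisioned disk."""
    backing = client_factory.create('ns0:VirtualDiskFlatVer2BackingInfo')
    backing.fileName = vmdk_path     # e.g. '[datastore1] volume-.../...vmdk'
    backing.diskMode = 'persistent'
    backing.thinProvisioned = True   # matches "with type thin" above

    disk = client_factory.create('ns0:VirtualDisk')
    disk.backing = backing
    disk.controllerKey = controller_key
    disk.unitNumber = unit_number
    disk.key = -100                  # negative key marks a new device

    change = client_factory.create('ns0:VirtualDeviceConfigSpec')
    change.operation = 'add'
    change.device = disk

    spec = client_factory.create('ns0:VirtualMachineConfigSpec')
    spec.deviceChange = [change]
    return spec
```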
[ 707.760787] env[68571]: DEBUG nova.virt.vmwareapi.volumeops [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Reconfigured VM instance instance-0000000d to attach disk [datastore1] volume-7451f85f-ba63-4b63-8e08-4ec8061213ed/volume-7451f85f-ba63-4b63-8e08-4ec8061213ed.vmdk or device None with type thin {{(pid=68571) attach_disk_to_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:88}} [ 707.765390] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-7b9d5449-ea0e-4005-88fd-30782e53646b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 707.781363] env[68571]: DEBUG oslo_vmware.api [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Waiting for the task: (returnval){ [ 707.781363] env[68571]: value = "task-3467605" [ 707.781363] env[68571]: _type = "Task" [ 707.781363] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 707.791381] env[68571]: DEBUG oslo_vmware.api [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Task: {'id': task-3467605, 'name': ReconfigVM_Task} progress is 6%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 707.886193] env[68571]: DEBUG nova.compute.manager [req-50eb4a05-2015-452b-889f-ba65872ee72a req-57ccae19-452f-408e-9ba8-58b3ccb53558 service nova] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Received event network-changed-e15fdec4-63a8-4a6d-8c72-d439ed56c710 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 707.886355] env[68571]: DEBUG nova.compute.manager [req-50eb4a05-2015-452b-889f-ba65872ee72a req-57ccae19-452f-408e-9ba8-58b3ccb53558 service nova] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Refreshing instance network info cache due to event network-changed-e15fdec4-63a8-4a6d-8c72-d439ed56c710.
{{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 707.886524] env[68571]: DEBUG oslo_concurrency.lockutils [req-50eb4a05-2015-452b-889f-ba65872ee72a req-57ccae19-452f-408e-9ba8-58b3ccb53558 service nova] Acquiring lock "refresh_cache-ccd1b692-f511-43c8-8b3d-ce92ef27670f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 707.886657] env[68571]: DEBUG oslo_concurrency.lockutils [req-50eb4a05-2015-452b-889f-ba65872ee72a req-57ccae19-452f-408e-9ba8-58b3ccb53558 service nova] Acquired lock "refresh_cache-ccd1b692-f511-43c8-8b3d-ce92ef27670f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 707.886807] env[68571]: DEBUG nova.network.neutron [req-50eb4a05-2015-452b-889f-ba65872ee72a req-57ccae19-452f-408e-9ba8-58b3ccb53558 service nova] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Refreshing network info cache for port e15fdec4-63a8-4a6d-8c72-d439ed56c710 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 708.170233] env[68571]: DEBUG nova.network.neutron [req-50eb4a05-2015-452b-889f-ba65872ee72a req-57ccae19-452f-408e-9ba8-58b3ccb53558 service nova] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Updated VIF entry in instance network info cache for port e15fdec4-63a8-4a6d-8c72-d439ed56c710. {{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 708.170800] env[68571]: DEBUG nova.network.neutron [req-50eb4a05-2015-452b-889f-ba65872ee72a req-57ccae19-452f-408e-9ba8-58b3ccb53558 service nova] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Updating instance_info_cache with network_info: [{"id": "e15fdec4-63a8-4a6d-8c72-d439ed56c710", "address": "fa:16:3e:07:3b:76", "network": {"id": "6c249714-cde3-43c8-9f93-1cc6dba49eef", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-575203170-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b8e068f9c390406892fe822978985780", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "572b7281-aad3-45fa-9cb2-fc1c70569948", "external-id": "nsx-vlan-transportzone-722", "segmentation_id": 722, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape15fdec4-63", "ovs_interfaceid": "e15fdec4-63a8-4a6d-8c72-d439ed56c710", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 708.182010] env[68571]: DEBUG oslo_concurrency.lockutils [req-50eb4a05-2015-452b-889f-ba65872ee72a req-57ccae19-452f-408e-9ba8-58b3ccb53558 service nova] Releasing lock "refresh_cache-ccd1b692-f511-43c8-8b3d-ce92ef27670f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 708.292393] env[68571]: DEBUG oslo_vmware.api [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Task: {'id': 
task-3467605, 'name': ReconfigVM_Task, 'duration_secs': 0.124378} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 708.292701] env[68571]: DEBUG nova.virt.vmwareapi.volumeops [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Attached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-692809', 'volume_id': '7451f85f-ba63-4b63-8e08-4ec8061213ed', 'name': 'volume-7451f85f-ba63-4b63-8e08-4ec8061213ed', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'ccd1b692-f511-43c8-8b3d-ce92ef27670f', 'attached_at': '', 'detached_at': '', 'volume_id': '7451f85f-ba63-4b63-8e08-4ec8061213ed', 'serial': '7451f85f-ba63-4b63-8e08-4ec8061213ed'} {{(pid=68571) _attach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:361}} [ 708.293408] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.Rename_Task with opID=oslo.vmware-d860b982-4c67-4df1-8e13-602caca9698b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 708.300304] env[68571]: DEBUG oslo_vmware.api [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Waiting for the task: (returnval){ [ 708.300304] env[68571]: value = "task-3467606" [ 708.300304] env[68571]: _type = "Task" [ 708.300304] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 708.308115] env[68571]: DEBUG oslo_vmware.api [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Task: {'id': task-3467606, 'name': Rename_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 708.810586] env[68571]: DEBUG oslo_vmware.api [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Task: {'id': task-3467606, 'name': Rename_Task, 'duration_secs': 0.122613} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 708.810871] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Powering on the VM {{(pid=68571) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1442}} [ 708.812014] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOnVM_Task with opID=oslo.vmware-4682b9e4-eda1-4205-a97f-1d82c8650450 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 708.817702] env[68571]: DEBUG oslo_vmware.api [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Waiting for the task: (returnval){ [ 708.817702] env[68571]: value = "task-3467607" [ 708.817702] env[68571]: _type = "Task" [ 708.817702] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 708.825562] env[68571]: DEBUG oslo_vmware.api [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Task: {'id': task-3467607, 'name': PowerOnVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 709.327115] env[68571]: DEBUG oslo_vmware.api [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Task: {'id': task-3467607, 'name': PowerOnVM_Task, 'duration_secs': 0.43152} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 709.327252] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Powered on the VM {{(pid=68571) power_on_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1448}} [ 709.327450] env[68571]: INFO nova.compute.manager [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Took 4.33 seconds to spawn the instance on the hypervisor. [ 709.327849] env[68571]: DEBUG nova.compute.manager [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Checking state {{(pid=68571) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} [ 709.328493] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-665f8171-eeac-46d9-8094-2c226deca78d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 709.391222] env[68571]: INFO nova.compute.manager [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Took 5.40 seconds to build instance. [ 709.404526] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a6413be3-9789-40b5-b0e0-1f4cb198c4c0 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Lock "ccd1b692-f511-43c8-8b3d-ce92ef27670f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 136.938s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 709.416981] env[68571]: DEBUG nova.compute.manager [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Starting instance... 
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 709.475808] env[68571]: DEBUG oslo_concurrency.lockutils [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 709.475969] env[68571]: DEBUG oslo_concurrency.lockutils [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 709.477363] env[68571]: INFO nova.compute.claims [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 709.888180] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9b1da2a-442e-4562-acb4-0ffb3f7e327f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 709.895830] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3f6758b3-6ecd-4e7e-be9a-182fa8a4586f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 709.925294] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-062c4674-bc26-4168-9576-2c59b66c908d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 709.932556] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-93b1f860-600b-4347-945a-8fe8f1dde2d2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 709.946974] env[68571]: DEBUG nova.compute.provider_tree [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 709.957513] env[68571]: DEBUG nova.scheduler.client.report [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 
709.971992] env[68571]: DEBUG oslo_concurrency.lockutils [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.496s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 709.972461] env[68571]: DEBUG nova.compute.manager [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 710.007530] env[68571]: DEBUG nova.compute.utils [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 710.009194] env[68571]: DEBUG nova.compute.manager [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 710.009298] env[68571]: DEBUG nova.network.neutron [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 710.025662] env[68571]: DEBUG nova.compute.manager [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 710.094719] env[68571]: DEBUG nova.compute.manager [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 710.096829] env[68571]: DEBUG nova.policy [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd277a30408e842518486b4a242225d26', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b7ef9161f1f54bd2854a2cd4d99d2c77', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 710.118977] env[68571]: DEBUG nova.virt.hardware [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 710.119242] env[68571]: DEBUG nova.virt.hardware [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 710.119399] env[68571]: DEBUG nova.virt.hardware [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 710.119579] env[68571]: DEBUG nova.virt.hardware [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 710.119725] env[68571]: DEBUG nova.virt.hardware [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 710.120028] env[68571]: DEBUG nova.virt.hardware [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) 
get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 710.120097] env[68571]: DEBUG nova.virt.hardware [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 710.120280] env[68571]: DEBUG nova.virt.hardware [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 710.120526] env[68571]: DEBUG nova.virt.hardware [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 710.120790] env[68571]: DEBUG nova.virt.hardware [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 710.121086] env[68571]: DEBUG nova.virt.hardware [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 710.122227] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9af6d215-1fd0-4dc7-9cb0-51aff4c750fe {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 710.130061] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce1db76b-4a57-4d1d-b3a8-90ba5c388792 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 710.448073] env[68571]: DEBUG nova.network.neutron [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Successfully created port: 0c6aa2aa-2095-4eba-8062-de34ad1b3c2a {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 711.134519] env[68571]: DEBUG nova.network.neutron [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Successfully updated port: 0c6aa2aa-2095-4eba-8062-de34ad1b3c2a {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 711.148918] env[68571]: DEBUG oslo_concurrency.lockutils [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Acquiring lock "refresh_cache-3adaf481-5844-45ac-8dc9-eb396a47ed1c" 
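{{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}

The hardware.py walk above (flavor and image limits of 0:0:0, a 65536-wide maximum, exactly one possible topology for one vCPU) boils down to enumerating every sockets*cores*threads factorisation of the vCPU count that fits under the limits, then sorting the candidates by preference. A toy re-derivation, not Nova's hardware.py:

```python
def possible_topologies(vcpus, max_sockets, max_cores, max_threads):
    """Enumerate (sockets, cores, threads) whose product equals vcpus."""
    for sockets in range(1, min(vcpus, max_sockets) + 1):
        if vcpus % sockets:
            continue
        for cores in range(1, min(vcpus // sockets, max_cores) + 1):
            if (vcpus // sockets) % cores:
                continue
            threads = vcpus // (sockets * cores)
            if threads <= max_threads:
                yield (sockets, cores, threads)

# One vCPU factors only as 1:1:1, matching "Possible topologies
# [VirtCPUTopology(cores=1,sockets=1,threads=1)]" above.
print(list(possible_topologies(1, 65536, 65536, 65536)))  # [(1, 1, 1)]
```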
[ 711.149081] env[68571]: DEBUG oslo_concurrency.lockutils [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Acquired lock "refresh_cache-3adaf481-5844-45ac-8dc9-eb396a47ed1c" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 711.149252] env[68571]: DEBUG nova.network.neutron [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 711.192138] env[68571]: DEBUG nova.network.neutron [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 711.375095] env[68571]: DEBUG nova.network.neutron [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Updating instance_info_cache with network_info: [{"id": "0c6aa2aa-2095-4eba-8062-de34ad1b3c2a", "address": "fa:16:3e:13:b9:b5", "network": {"id": "802e91c0-b497-4996-a9a8-0fb2969a1fd5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.34", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "129da41d4b1a4202be57f86562f628cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0c6aa2aa-20", "ovs_interfaceid": "0c6aa2aa-2095-4eba-8062-de34ad1b3c2a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 711.398047] env[68571]: DEBUG oslo_concurrency.lockutils [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Releasing lock "refresh_cache-3adaf481-5844-45ac-8dc9-eb396a47ed1c" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 711.398047] env[68571]: DEBUG nova.compute.manager [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Instance network_info: |[{"id": "0c6aa2aa-2095-4eba-8062-de34ad1b3c2a",
"address": "fa:16:3e:13:b9:b5", "network": {"id": "802e91c0-b497-4996-a9a8-0fb2969a1fd5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.34", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "129da41d4b1a4202be57f86562f628cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0c6aa2aa-20", "ovs_interfaceid": "0c6aa2aa-2095-4eba-8062-de34ad1b3c2a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 711.398262] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:13:b9:b5', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f3d7e184-c87f-47a5-8d0d-9fa20e07e669', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '0c6aa2aa-2095-4eba-8062-de34ad1b3c2a', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 711.406325] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Creating folder: Project (b7ef9161f1f54bd2854a2cd4d99d2c77). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 711.406646] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2f6c2764-1fda-4bda-9214-7e148fe95cfe {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.418408] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Created folder: Project (b7ef9161f1f54bd2854a2cd4d99d2c77) in parent group-v692787. [ 711.418852] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Creating folder: Instances. Parent ref: group-v692828. 
{{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 711.419251] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-04c4e447-c89d-4e69-8101-5ece49920175 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.430899] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Created folder: Instances in parent group-v692828. [ 711.430899] env[68571]: DEBUG oslo.service.loopingcall [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 711.430899] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 711.430899] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-41bd567a-d0fb-4fca-988f-3ee1693f40a5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.455067] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 711.455067] env[68571]: value = "task-3467610" [ 711.455067] env[68571]: _type = "Task" [ 711.455067] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 711.467680] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467610, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 711.576469] env[68571]: DEBUG nova.compute.manager [req-6c97838f-67d2-4216-b501-305f9c39701a req-74a77853-27fc-45bf-b95c-56f047803f43 service nova] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Received event network-vif-plugged-0c6aa2aa-2095-4eba-8062-de34ad1b3c2a {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 711.576469] env[68571]: DEBUG oslo_concurrency.lockutils [req-6c97838f-67d2-4216-b501-305f9c39701a req-74a77853-27fc-45bf-b95c-56f047803f43 service nova] Acquiring lock "3adaf481-5844-45ac-8dc9-eb396a47ed1c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 711.576469] env[68571]: DEBUG oslo_concurrency.lockutils [req-6c97838f-67d2-4216-b501-305f9c39701a req-74a77853-27fc-45bf-b95c-56f047803f43 service nova] Lock "3adaf481-5844-45ac-8dc9-eb396a47ed1c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 711.576469] env[68571]: DEBUG oslo_concurrency.lockutils [req-6c97838f-67d2-4216-b501-305f9c39701a req-74a77853-27fc-45bf-b95c-56f047803f43 service nova] Lock "3adaf481-5844-45ac-8dc9-eb396a47ed1c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 711.577351] env[68571]: DEBUG nova.compute.manager [req-6c97838f-67d2-4216-b501-305f9c39701a req-74a77853-27fc-45bf-b95c-56f047803f43 service nova] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] No waiting events found dispatching network-vif-plugged-0c6aa2aa-2095-4eba-8062-de34ad1b3c2a {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 711.577887] env[68571]: WARNING nova.compute.manager [req-6c97838f-67d2-4216-b501-305f9c39701a req-74a77853-27fc-45bf-b95c-56f047803f43 service nova] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Received unexpected event network-vif-plugged-0c6aa2aa-2095-4eba-8062-de34ad1b3c2a for instance with vm_state building and task_state spawning. [ 711.972262] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467610, 'name': CreateVM_Task, 'duration_secs': 0.284032} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 711.972262] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 711.980141] env[68571]: DEBUG oslo_concurrency.lockutils [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 711.980141] env[68571]: DEBUG oslo_concurrency.lockutils [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 711.980141] env[68571]: DEBUG oslo_concurrency.lockutils [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 711.980141] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0f60aaae-3f0d-4ee9-8737-02948203a5f7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 711.985673] env[68571]: DEBUG oslo_vmware.api [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Waiting for the task: (returnval){ [ 711.985673] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]523dc097-00c9-1b11-1a34-0b02d00d4be3" [ 711.985673] env[68571]: _type = "Task" [ 711.985673] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 711.995697] env[68571]: DEBUG oslo_vmware.api [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]523dc097-00c9-1b11-1a34-0b02d00d4be3, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 712.447654] env[68571]: DEBUG nova.compute.manager [req-e7880af5-7552-44c0-967f-d3d1296d7a05 req-23091a21-e1fc-4ae4-a68b-592b409ebed6 service nova] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Received event network-changed-e15fdec4-63a8-4a6d-8c72-d439ed56c710 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 712.447902] env[68571]: DEBUG nova.compute.manager [req-e7880af5-7552-44c0-967f-d3d1296d7a05 req-23091a21-e1fc-4ae4-a68b-592b409ebed6 service nova] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Refreshing instance network info cache due to event network-changed-e15fdec4-63a8-4a6d-8c72-d439ed56c710. 
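{{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}

Each network-changed event funnels through the same sequence visible below: acquire the per-instance "refresh_cache-<uuid>" lock, re-query Neutron, update the instance network info cache, release the lock. A minimal sketch of that pattern with oslo.concurrency, where `get_network_info` and `save_cache` stand in for the Neutron round-trip and the cache write:

```python
from oslo_concurrency import lockutils

def refresh_network_cache(instance_uuid, get_network_info, save_cache):
    """Serialise per-instance cache refreshes across greenthreads."""
    # Produces the Acquiring/Acquired/Releasing lock lines in this log.
    with lockutils.lock('refresh_cache-%s' % instance_uuid):
        network_info = get_network_info(instance_uuid)
        save_cache(instance_uuid, network_info)
```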
[ 712.449341] env[68571]: DEBUG oslo_concurrency.lockutils [req-e7880af5-7552-44c0-967f-d3d1296d7a05 req-23091a21-e1fc-4ae4-a68b-592b409ebed6 service nova] Acquiring lock "refresh_cache-ccd1b692-f511-43c8-8b3d-ce92ef27670f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 712.449341] env[68571]: DEBUG oslo_concurrency.lockutils [req-e7880af5-7552-44c0-967f-d3d1296d7a05 req-23091a21-e1fc-4ae4-a68b-592b409ebed6 service nova] Acquired lock "refresh_cache-ccd1b692-f511-43c8-8b3d-ce92ef27670f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 712.449341] env[68571]: DEBUG nova.network.neutron [req-e7880af5-7552-44c0-967f-d3d1296d7a05 req-23091a21-e1fc-4ae4-a68b-592b409ebed6 service nova] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Refreshing network info cache for port e15fdec4-63a8-4a6d-8c72-d439ed56c710 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 712.497735] env[68571]: DEBUG oslo_concurrency.lockutils [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 712.498147] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 712.498499] env[68571]: DEBUG oslo_concurrency.lockutils [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 713.109607] env[68571]: DEBUG nova.network.neutron [req-e7880af5-7552-44c0-967f-d3d1296d7a05 req-23091a21-e1fc-4ae4-a68b-592b409ebed6 service nova] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Updated VIF entry in instance network info cache for port e15fdec4-63a8-4a6d-8c72-d439ed56c710.
{{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 713.109938] env[68571]: DEBUG nova.network.neutron [req-e7880af5-7552-44c0-967f-d3d1296d7a05 req-23091a21-e1fc-4ae4-a68b-592b409ebed6 service nova] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Updating instance_info_cache with network_info: [{"id": "e15fdec4-63a8-4a6d-8c72-d439ed56c710", "address": "fa:16:3e:07:3b:76", "network": {"id": "6c249714-cde3-43c8-9f93-1cc6dba49eef", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-575203170-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "10.180.180.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "b8e068f9c390406892fe822978985780", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "572b7281-aad3-45fa-9cb2-fc1c70569948", "external-id": "nsx-vlan-transportzone-722", "segmentation_id": 722, "bound_drivers": {"0": "nsxv3"}}, "devname": "tape15fdec4-63", "ovs_interfaceid": "e15fdec4-63a8-4a6d-8c72-d439ed56c710", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 713.127172] env[68571]: DEBUG oslo_concurrency.lockutils [req-e7880af5-7552-44c0-967f-d3d1296d7a05 req-23091a21-e1fc-4ae4-a68b-592b409ebed6 service nova] Releasing lock "refresh_cache-ccd1b692-f511-43c8-8b3d-ce92ef27670f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 713.681197] env[68571]: DEBUG nova.compute.manager [req-6f8b8e5b-476e-4bfb-8c27-faab1a17fc73 req-ee345daf-368d-47a7-897c-5328d714f911 service nova] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Received event network-changed-0c6aa2aa-2095-4eba-8062-de34ad1b3c2a {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 713.681197] env[68571]: DEBUG nova.compute.manager [req-6f8b8e5b-476e-4bfb-8c27-faab1a17fc73 req-ee345daf-368d-47a7-897c-5328d714f911 service nova] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Refreshing instance network info cache due to event network-changed-0c6aa2aa-2095-4eba-8062-de34ad1b3c2a. 
{{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 713.681197] env[68571]: DEBUG oslo_concurrency.lockutils [req-6f8b8e5b-476e-4bfb-8c27-faab1a17fc73 req-ee345daf-368d-47a7-897c-5328d714f911 service nova] Acquiring lock "refresh_cache-3adaf481-5844-45ac-8dc9-eb396a47ed1c" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 713.681197] env[68571]: DEBUG oslo_concurrency.lockutils [req-6f8b8e5b-476e-4bfb-8c27-faab1a17fc73 req-ee345daf-368d-47a7-897c-5328d714f911 service nova] Acquired lock "refresh_cache-3adaf481-5844-45ac-8dc9-eb396a47ed1c" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 713.681197] env[68571]: DEBUG nova.network.neutron [req-6f8b8e5b-476e-4bfb-8c27-faab1a17fc73 req-ee345daf-368d-47a7-897c-5328d714f911 service nova] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Refreshing network info cache for port 0c6aa2aa-2095-4eba-8062-de34ad1b3c2a {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 714.250872] env[68571]: DEBUG nova.network.neutron [req-6f8b8e5b-476e-4bfb-8c27-faab1a17fc73 req-ee345daf-368d-47a7-897c-5328d714f911 service nova] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Updated VIF entry in instance network info cache for port 0c6aa2aa-2095-4eba-8062-de34ad1b3c2a. {{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 714.251649] env[68571]: DEBUG nova.network.neutron [req-6f8b8e5b-476e-4bfb-8c27-faab1a17fc73 req-ee345daf-368d-47a7-897c-5328d714f911 service nova] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Updating instance_info_cache with network_info: [{"id": "0c6aa2aa-2095-4eba-8062-de34ad1b3c2a", "address": "fa:16:3e:13:b9:b5", "network": {"id": "802e91c0-b497-4996-a9a8-0fb2969a1fd5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.34", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "129da41d4b1a4202be57f86562f628cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap0c6aa2aa-20", "ovs_interfaceid": "0c6aa2aa-2095-4eba-8062-de34ad1b3c2a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 714.260853] env[68571]: DEBUG oslo_concurrency.lockutils [req-6f8b8e5b-476e-4bfb-8c27-faab1a17fc73 req-ee345daf-368d-47a7-897c-5328d714f911 service nova] Releasing lock "refresh_cache-3adaf481-5844-45ac-8dc9-eb396a47ed1c" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 717.171763] env[68571]: DEBUG oslo_concurrency.lockutils [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquiring lock 
"3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 717.172121] env[68571]: DEBUG oslo_concurrency.lockutils [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 727.503688] env[68571]: INFO nova.compute.manager [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Rebuilding instance [ 727.557636] env[68571]: DEBUG nova.compute.manager [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Checking state {{(pid=68571) _get_power_state /opt/stack/nova/nova/compute/manager.py:1766}} [ 727.558556] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba3f1b72-9469-4bdf-ba1c-612612f1ebfa {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 727.614318] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Powering off the VM {{(pid=68571) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 727.614858] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-1ccb6f07-bf16-4d43-a3fb-9618dcaeb34e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 727.622681] env[68571]: DEBUG oslo_vmware.api [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Waiting for the task: (returnval){ [ 727.622681] env[68571]: value = "task-3467611" [ 727.622681] env[68571]: _type = "Task" [ 727.622681] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 727.631869] env[68571]: DEBUG oslo_vmware.api [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Task: {'id': task-3467611, 'name': PowerOffVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 728.132563] env[68571]: DEBUG oslo_vmware.api [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Task: {'id': task-3467611, 'name': PowerOffVM_Task, 'duration_secs': 0.166289} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 728.132921] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Powered off the VM {{(pid=68571) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1507}} [ 728.133484] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Powering off the VM {{(pid=68571) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1502}} [ 728.133728] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.PowerOffVM_Task with opID=oslo.vmware-07b4fd4f-3e4b-4fac-89ea-fd4ff0c065c4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 728.140030] env[68571]: DEBUG oslo_vmware.api [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Waiting for the task: (returnval){ [ 728.140030] env[68571]: value = "task-3467612" [ 728.140030] env[68571]: _type = "Task" [ 728.140030] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 728.147525] env[68571]: DEBUG oslo_vmware.api [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Task: {'id': task-3467612, 'name': PowerOffVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 728.650421] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] VM already powered off {{(pid=68571) power_off_instance /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1509}} [ 728.650702] env[68571]: DEBUG nova.virt.vmwareapi.volumeops [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Volume detach. 
Driver type: vmdk {{(pid=68571) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}} [ 728.650909] env[68571]: DEBUG nova.virt.vmwareapi.volumeops [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] _detach_volume_vmdk: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-692809', 'volume_id': '7451f85f-ba63-4b63-8e08-4ec8061213ed', 'name': 'volume-7451f85f-ba63-4b63-8e08-4ec8061213ed', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'ccd1b692-f511-43c8-8b3d-ce92ef27670f', 'attached_at': '', 'detached_at': '', 'volume_id': '7451f85f-ba63-4b63-8e08-4ec8061213ed', 'serial': '7451f85f-ba63-4b63-8e08-4ec8061213ed'} {{(pid=68571) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:571}} [ 728.651685] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da31a1e1-a5ff-4d76-afe3-964752f031fa {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 728.669348] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe567518-8b36-4294-88dc-ac1de12407dc {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 728.675485] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac64ec66-aadb-4a58-b5f0-3ddf8e36ffcd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 728.692514] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af7ba0d7-39d8-47f3-9174-8e8bbe2961e5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 728.707115] env[68571]: DEBUG nova.virt.vmwareapi.volumeops [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] The volume has not been displaced from its original location: [datastore1] volume-7451f85f-ba63-4b63-8e08-4ec8061213ed/volume-7451f85f-ba63-4b63-8e08-4ec8061213ed.vmdk. No consolidation needed. 
{{(pid=68571) _consolidate_vmdk_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:504}} [ 728.712442] env[68571]: DEBUG nova.virt.vmwareapi.volumeops [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Reconfiguring VM instance instance-0000000d to detach disk 2000 {{(pid=68571) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:122}} [ 728.712766] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-88106a74-c1f2-4f04-b9fc-99b71ba3c68d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 728.730673] env[68571]: DEBUG oslo_vmware.api [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Waiting for the task: (returnval){ [ 728.730673] env[68571]: value = "task-3467613" [ 728.730673] env[68571]: _type = "Task" [ 728.730673] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 728.740431] env[68571]: DEBUG oslo_vmware.api [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Task: {'id': task-3467613, 'name': ReconfigVM_Task} progress is 6%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 729.241484] env[68571]: DEBUG oslo_vmware.api [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Task: {'id': task-3467613, 'name': ReconfigVM_Task, 'duration_secs': 0.174911} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 729.241778] env[68571]: DEBUG nova.virt.vmwareapi.volumeops [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Reconfigured VM instance instance-0000000d to detach disk 2000 {{(pid=68571) detach_disk_from_vm /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:127}} [ 729.246610] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.ReconfigVM_Task with opID=oslo.vmware-f939ade7-3ad7-49f8-86d7-1696ed8b026b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 729.261954] env[68571]: DEBUG oslo_vmware.api [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Waiting for the task: (returnval){ [ 729.261954] env[68571]: value = "task-3467614" [ 729.261954] env[68571]: _type = "Task" [ 729.261954] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 729.270492] env[68571]: DEBUG oslo_vmware.api [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Task: {'id': task-3467614, 'name': ReconfigVM_Task} progress is 5%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 729.773067] env[68571]: DEBUG oslo_vmware.api [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Task: {'id': task-3467614, 'name': ReconfigVM_Task, 'duration_secs': 0.112141} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 729.773067] env[68571]: DEBUG nova.virt.vmwareapi.volumeops [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Detached VMDK: {'driver_volume_type': 'vmdk', 'data': {'volume': 'vm-692809', 'volume_id': '7451f85f-ba63-4b63-8e08-4ec8061213ed', 'name': 'volume-7451f85f-ba63-4b63-8e08-4ec8061213ed', 'profile_id': None, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'ccd1b692-f511-43c8-8b3d-ce92ef27670f', 'attached_at': '', 'detached_at': '', 'volume_id': '7451f85f-ba63-4b63-8e08-4ec8061213ed', 'serial': '7451f85f-ba63-4b63-8e08-4ec8061213ed'} {{(pid=68571) _detach_volume_vmdk /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:605}} [ 729.773067] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 729.773703] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cce5ccfd-79f8-4903-857e-f18accbbcef1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 729.780018] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 729.780236] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-20482b6e-237b-4063-817f-bfa4d1da07c5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 729.846122] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 729.846360] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 729.846539] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 
tempest-ServerActionsV293TestJSON-1571628166-project-member] Deleting the datastore file [datastore1] ccd1b692-f511-43c8-8b3d-ce92ef27670f {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 729.846817] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7999ccfb-3a0a-4493-a94f-3558557fc11c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 729.853257] env[68571]: DEBUG oslo_vmware.api [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Waiting for the task: (returnval){ [ 729.853257] env[68571]: value = "task-3467616" [ 729.853257] env[68571]: _type = "Task" [ 729.853257] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 729.862479] env[68571]: DEBUG oslo_vmware.api [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Task: {'id': task-3467616, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 730.366824] env[68571]: DEBUG oslo_vmware.api [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Task: {'id': task-3467616, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.082352} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 730.367243] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 730.367537] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 730.367828] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 730.427078] env[68571]: DEBUG nova.virt.vmwareapi.volumeops [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Volume detach. 
Driver type: vmdk {{(pid=68571) detach_volume /opt/stack/nova/nova/virt/vmwareapi/volumeops.py:646}} [ 730.427416] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-cb8c1b27-fbd6-4f9c-bbd9-160b6941c4d9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.435453] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8a63cfd-8cfb-412b-a952-c09bf42dbcc7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 730.464598] env[68571]: ERROR nova.compute.manager [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Failed to detach volume 7451f85f-ba63-4b63-8e08-4ec8061213ed from /dev/sda: nova.exception.InstanceNotFound: Instance ccd1b692-f511-43c8-8b3d-ce92ef27670f could not be found. [ 730.464598] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Traceback (most recent call last): [ 730.464598] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] File "/opt/stack/nova/nova/compute/manager.py", line 4116, in _do_rebuild_instance [ 730.464598] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] self.driver.rebuild(**kwargs) [ 730.464598] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] File "/opt/stack/nova/nova/virt/driver.py", line 384, in rebuild [ 730.464598] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] raise NotImplementedError() [ 730.464598] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] NotImplementedError [ 730.464598] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] [ 730.464598] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] During handling of the above exception, another exception occurred: [ 730.464598] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] [ 730.464598] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Traceback (most recent call last): [ 730.464598] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] File "/opt/stack/nova/nova/compute/manager.py", line 3539, in _detach_root_volume [ 730.464598] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] self.driver.detach_volume(context, old_connection_info, [ 730.464970] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 552, in detach_volume [ 730.464970] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] return self._volumeops.detach_volume(connection_info, instance) [ 730.464970] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 649, in detach_volume [ 730.464970] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] self._detach_volume_vmdk(connection_info, instance) [ 730.464970] env[68571]: ERROR nova.compute.manager [instance: 
ccd1b692-f511-43c8-8b3d-ce92ef27670f] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 569, in _detach_volume_vmdk [ 730.464970] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] vm_ref = vm_util.get_vm_ref(self._session, instance) [ 730.464970] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1135, in get_vm_ref [ 730.464970] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] stable_ref.fetch_moref(session) [ 730.464970] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1126, in fetch_moref [ 730.464970] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] raise exception.InstanceNotFound(instance_id=self._uuid) [ 730.464970] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] nova.exception.InstanceNotFound: Instance ccd1b692-f511-43c8-8b3d-ce92ef27670f could not be found. [ 730.464970] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] [ 730.604449] env[68571]: DEBUG nova.compute.utils [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Build of instance ccd1b692-f511-43c8-8b3d-ce92ef27670f aborted: Failed to rebuild volume backed instance. {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 730.606953] env[68571]: ERROR nova.compute.manager [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Setting instance vm_state to ERROR: nova.exception.BuildAbortException: Build of instance ccd1b692-f511-43c8-8b3d-ce92ef27670f aborted: Failed to rebuild volume backed instance. 
[ 730.606953] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Traceback (most recent call last): [ 730.606953] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] File "/opt/stack/nova/nova/compute/manager.py", line 4116, in _do_rebuild_instance [ 730.606953] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] self.driver.rebuild(**kwargs) [ 730.606953] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] File "/opt/stack/nova/nova/virt/driver.py", line 384, in rebuild [ 730.606953] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] raise NotImplementedError() [ 730.606953] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] NotImplementedError [ 730.606953] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] [ 730.606953] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] During handling of the above exception, another exception occurred: [ 730.606953] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] [ 730.606953] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Traceback (most recent call last): [ 730.606953] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] File "/opt/stack/nova/nova/compute/manager.py", line 3574, in _rebuild_volume_backed_instance [ 730.607343] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] self._detach_root_volume(context, instance, root_bdm) [ 730.607343] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] File "/opt/stack/nova/nova/compute/manager.py", line 3553, in _detach_root_volume [ 730.607343] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] with excutils.save_and_reraise_exception(): [ 730.607343] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 730.607343] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] self.force_reraise() [ 730.607343] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 730.607343] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] raise self.value [ 730.607343] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] File "/opt/stack/nova/nova/compute/manager.py", line 3539, in _detach_root_volume [ 730.607343] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] self.driver.detach_volume(context, old_connection_info, [ 730.607343] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 552, in detach_volume [ 730.607343] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] return self._volumeops.detach_volume(connection_info, instance) [ 730.607343] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] File 
"/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 649, in detach_volume [ 730.607343] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] self._detach_volume_vmdk(connection_info, instance) [ 730.607709] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] File "/opt/stack/nova/nova/virt/vmwareapi/volumeops.py", line 569, in _detach_volume_vmdk [ 730.607709] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] vm_ref = vm_util.get_vm_ref(self._session, instance) [ 730.607709] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1135, in get_vm_ref [ 730.607709] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] stable_ref.fetch_moref(session) [ 730.607709] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1126, in fetch_moref [ 730.607709] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] raise exception.InstanceNotFound(instance_id=self._uuid) [ 730.607709] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] nova.exception.InstanceNotFound: Instance ccd1b692-f511-43c8-8b3d-ce92ef27670f could not be found. [ 730.607709] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] [ 730.607709] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] During handling of the above exception, another exception occurred: [ 730.607709] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] [ 730.607709] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Traceback (most recent call last): [ 730.607709] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] File "/opt/stack/nova/nova/compute/manager.py", line 10835, in _error_out_instance_on_exception [ 730.607709] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] yield [ 730.607709] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] File "/opt/stack/nova/nova/compute/manager.py", line 3842, in rebuild_instance [ 730.608095] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] self._do_rebuild_instance_with_claim( [ 730.608095] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] File "/opt/stack/nova/nova/compute/manager.py", line 3928, in _do_rebuild_instance_with_claim [ 730.608095] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] self._do_rebuild_instance( [ 730.608095] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] File "/opt/stack/nova/nova/compute/manager.py", line 4120, in _do_rebuild_instance [ 730.608095] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] self._rebuild_default_impl(**kwargs) [ 730.608095] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] File "/opt/stack/nova/nova/compute/manager.py", line 3697, in _rebuild_default_impl [ 730.608095] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] self._rebuild_volume_backed_instance( [ 
730.608095] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] File "/opt/stack/nova/nova/compute/manager.py", line 3589, in _rebuild_volume_backed_instance [ 730.608095] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] raise exception.BuildAbortException( [ 730.608095] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] nova.exception.BuildAbortException: Build of instance ccd1b692-f511-43c8-8b3d-ce92ef27670f aborted: Failed to rebuild volume backed instance. [ 730.608095] env[68571]: ERROR nova.compute.manager [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] [ 730.714936] env[68571]: DEBUG oslo_concurrency.lockutils [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 730.715295] env[68571]: DEBUG oslo_concurrency.lockutils [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 731.064517] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34bcbdaa-e4fe-4930-8c74-30af4d5de007 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.072564] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6eb2a76-b7dc-4779-bc69-fb8774ca7dd5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.103189] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-864ea993-b1c8-49f9-8865-f7772dd947ff {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.110494] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9d81d2a-726d-4a00-937f-f9652bc3c36a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.123286] env[68571]: DEBUG nova.compute.provider_tree [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 731.131904] env[68571]: DEBUG nova.scheduler.client.report [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 
'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 731.147391] env[68571]: DEBUG oslo_concurrency.lockutils [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.432s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 731.147641] env[68571]: INFO nova.compute.manager [None req-93d2f84e-4d5e-42be-b830-80e419c8dfa4 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Successfully reverted task state from rebuilding on failure for instance. [ 731.835350] env[68571]: DEBUG oslo_concurrency.lockutils [None req-403ffb9d-e936-4ddb-a828-521108fe02c6 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Acquiring lock "ccd1b692-f511-43c8-8b3d-ce92ef27670f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 731.836059] env[68571]: DEBUG oslo_concurrency.lockutils [None req-403ffb9d-e936-4ddb-a828-521108fe02c6 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Lock "ccd1b692-f511-43c8-8b3d-ce92ef27670f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 731.836059] env[68571]: DEBUG oslo_concurrency.lockutils [None req-403ffb9d-e936-4ddb-a828-521108fe02c6 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Acquiring lock "ccd1b692-f511-43c8-8b3d-ce92ef27670f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 731.836059] env[68571]: DEBUG oslo_concurrency.lockutils [None req-403ffb9d-e936-4ddb-a828-521108fe02c6 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Lock "ccd1b692-f511-43c8-8b3d-ce92ef27670f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 731.836249] env[68571]: DEBUG oslo_concurrency.lockutils [None req-403ffb9d-e936-4ddb-a828-521108fe02c6 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Lock "ccd1b692-f511-43c8-8b3d-ce92ef27670f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 731.838378] env[68571]: INFO nova.compute.manager [None req-403ffb9d-e936-4ddb-a828-521108fe02c6 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: 
ccd1b692-f511-43c8-8b3d-ce92ef27670f] Terminating instance [ 731.841040] env[68571]: DEBUG nova.compute.manager [None req-403ffb9d-e936-4ddb-a828-521108fe02c6 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 731.841040] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-0b5da63b-a53f-4849-990b-802cf42f55ad {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.850113] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78c8b736-1444-41e9-9160-9fe61082b2f9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.880483] env[68571]: WARNING nova.virt.vmwareapi.driver [None req-403ffb9d-e936-4ddb-a828-521108fe02c6 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Instance does not exists. Proceeding to delete instance properties on datastore: nova.exception.InstanceNotFound: Instance ccd1b692-f511-43c8-8b3d-ce92ef27670f could not be found. [ 731.880714] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-403ffb9d-e936-4ddb-a828-521108fe02c6 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 731.881022] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-08f05157-aac7-4f04-bd09-efe8d4af42fc {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.889160] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2bb33ee8-f93c-49b8-8ef6-c2095beac2a3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 731.920447] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-403ffb9d-e936-4ddb-a828-521108fe02c6 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ccd1b692-f511-43c8-8b3d-ce92ef27670f could not be found. [ 731.920700] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-403ffb9d-e936-4ddb-a828-521108fe02c6 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 731.920914] env[68571]: INFO nova.compute.manager [None req-403ffb9d-e936-4ddb-a828-521108fe02c6 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Took 0.08 seconds to destroy the instance on the hypervisor. 
[ 731.921216] env[68571]: DEBUG oslo.service.loopingcall [None req-403ffb9d-e936-4ddb-a828-521108fe02c6 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 731.921475] env[68571]: DEBUG nova.compute.manager [-] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 731.921595] env[68571]: DEBUG nova.network.neutron [-] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 732.548012] env[68571]: DEBUG nova.network.neutron [-] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 732.559541] env[68571]: INFO nova.compute.manager [-] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Took 0.64 seconds to deallocate network for instance. [ 732.600733] env[68571]: DEBUG nova.compute.manager [req-721fcec0-14ca-45e8-b20f-3aa293df8cca req-a09398c7-d46c-4701-aeb6-a16b3b574cd3 service nova] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Received event network-vif-deleted-e15fdec4-63a8-4a6d-8c72-d439ed56c710 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 732.645195] env[68571]: INFO nova.compute.manager [None req-403ffb9d-e936-4ddb-a828-521108fe02c6 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Took 0.09 seconds to detach 1 volumes for instance. 
[ 732.650562] env[68571]: DEBUG nova.compute.manager [None req-403ffb9d-e936-4ddb-a828-521108fe02c6 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Deleting volume: 7451f85f-ba63-4b63-8e08-4ec8061213ed {{(pid=68571) _cleanup_volumes /opt/stack/nova/nova/compute/manager.py:3221}} [ 732.760940] env[68571]: DEBUG oslo_concurrency.lockutils [None req-403ffb9d-e936-4ddb-a828-521108fe02c6 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 732.760940] env[68571]: DEBUG oslo_concurrency.lockutils [None req-403ffb9d-e936-4ddb-a828-521108fe02c6 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 732.760940] env[68571]: DEBUG nova.objects.instance [None req-403ffb9d-e936-4ddb-a828-521108fe02c6 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Lazy-loading 'resources' on Instance uuid ccd1b692-f511-43c8-8b3d-ce92ef27670f {{(pid=68571) obj_load_attr /opt/stack/nova/nova/objects/instance.py:1152}} [ 733.242016] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4de2b12-4d1a-49cf-9a58-a9694b3fff7b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 733.253227] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bc127b2-6cc6-4d53-97f4-8e22b46e8f49 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 733.288561] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b727807f-ffcc-41f5-89c0-8984eaaa317f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 733.298675] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c982b1bb-ce92-427c-8799-fa6a34f93085 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 733.314856] env[68571]: DEBUG nova.compute.provider_tree [None req-403ffb9d-e936-4ddb-a828-521108fe02c6 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 733.329935] env[68571]: DEBUG nova.scheduler.client.report [None req-403ffb9d-e936-4ddb-a828-521108fe02c6 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 733.345906] env[68571]: DEBUG oslo_concurrency.lockutils [None req-403ffb9d-e936-4ddb-a828-521108fe02c6 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.587s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 733.417579] env[68571]: DEBUG oslo_concurrency.lockutils [None req-403ffb9d-e936-4ddb-a828-521108fe02c6 tempest-ServerActionsV293TestJSON-1571628166 tempest-ServerActionsV293TestJSON-1571628166-project-member] Lock "ccd1b692-f511-43c8-8b3d-ce92ef27670f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.582s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 740.465030] env[68571]: DEBUG oslo_concurrency.lockutils [None req-fffdb21b-53e3-451f-b39c-232bf6564e63 tempest-ServerTagsTestJSON-2108165716 tempest-ServerTagsTestJSON-2108165716-project-member] Acquiring lock "394b41bd-e7f7-4a77-87d1-6777e0991d50" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 740.465347] env[68571]: DEBUG oslo_concurrency.lockutils [None req-fffdb21b-53e3-451f-b39c-232bf6564e63 tempest-ServerTagsTestJSON-2108165716 tempest-ServerTagsTestJSON-2108165716-project-member] Lock "394b41bd-e7f7-4a77-87d1-6777e0991d50" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 749.489928] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 751.246328] env[68571]: WARNING oslo_vmware.rw_handles [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 751.246328] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 751.246328] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 751.246328] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 751.246328] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 751.246328] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 751.246328] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 751.246328] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 751.246328] env[68571]: ERROR oslo_vmware.rw_handles File 
"/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 751.246328] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 751.246328] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 751.246328] env[68571]: ERROR oslo_vmware.rw_handles [ 751.246954] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/5fe6d403-342e-4105-a38c-47d0094f535a/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 751.248543] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 751.248877] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Copying Virtual Disk [datastore1] vmware_temp/5fe6d403-342e-4105-a38c-47d0094f535a/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/5fe6d403-342e-4105-a38c-47d0094f535a/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 751.249188] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-00e5339d-32b0-4cf6-987c-4c598cfc1600 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 751.258807] env[68571]: DEBUG oslo_vmware.api [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Waiting for the task: (returnval){ [ 751.258807] env[68571]: value = "task-3467618" [ 751.258807] env[68571]: _type = "Task" [ 751.258807] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 751.266830] env[68571]: DEBUG oslo_vmware.api [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Task: {'id': task-3467618, 'name': CopyVirtualDisk_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 751.556868] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d734978e-ae98-4686-9f84-9ad4ffcd4226 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Acquiring lock "0eae5e9a-258a-44e5-9b4f-53100f15aa7a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 751.768783] env[68571]: DEBUG oslo_vmware.exceptions [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Fault InvalidArgument not matched. {{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 751.769097] env[68571]: DEBUG oslo_concurrency.lockutils [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 751.769699] env[68571]: ERROR nova.compute.manager [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 751.769699] env[68571]: Faults: ['InvalidArgument'] [ 751.769699] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Traceback (most recent call last): [ 751.769699] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 751.769699] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] yield resources [ 751.769699] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 751.769699] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] self.driver.spawn(context, instance, image_meta, [ 751.769699] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 751.769699] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 751.769699] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 751.769699] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] self._fetch_image_if_missing(context, vi) [ 751.769699] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 751.770075] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] image_cache(vi, tmp_image_ds_loc) [ 751.770075] env[68571]: ERROR 
nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 751.770075] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] vm_util.copy_virtual_disk( [ 751.770075] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 751.770075] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] session._wait_for_task(vmdk_copy_task) [ 751.770075] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 751.770075] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] return self.wait_for_task(task_ref) [ 751.770075] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 751.770075] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] return evt.wait() [ 751.770075] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 751.770075] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] result = hub.switch() [ 751.770075] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 751.770075] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] return self.greenlet.switch() [ 751.770430] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 751.770430] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] self.f(*self.args, **self.kw) [ 751.770430] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 751.770430] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] raise exceptions.translate_fault(task_info.error) [ 751.770430] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 751.770430] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Faults: ['InvalidArgument'] [ 751.770430] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] [ 751.770430] env[68571]: INFO nova.compute.manager [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Terminating instance [ 751.771533] env[68571]: DEBUG oslo_concurrency.lockutils [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 751.771736] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 751.771988] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2bcef398-1fb1-4671-b00c-cb8829ee45a4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 751.774258] env[68571]: DEBUG nova.compute.manager [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 751.774448] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 751.775211] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4bca4a2-1e15-4dd3-8221-8c01a4549564 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 751.782009] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 751.782232] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c2f67604-d87d-42ef-912a-d0a829364032 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 751.784362] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 751.784531] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 751.785489] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-6651d6a2-7345-4993-b00b-30721d6818ce {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 751.790690] env[68571]: DEBUG oslo_vmware.api [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Waiting for the task: (returnval){ [ 751.790690] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52b742a7-3e2d-f0c0-e664-1aa40cfbeba8" [ 751.790690] env[68571]: _type = "Task" [ 751.790690] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 751.799784] env[68571]: DEBUG oslo_vmware.api [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52b742a7-3e2d-f0c0-e664-1aa40cfbeba8, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 751.854890] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 751.855133] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 751.855313] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Deleting the datastore file [datastore1] f3b237f4-6e23-4474-b841-aa3ca8c1486f {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 751.855578] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-d77fe233-f321-4838-be8b-879747e17a15 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 751.862444] env[68571]: DEBUG oslo_vmware.api [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Waiting for the task: (returnval){ [ 751.862444] env[68571]: value = "task-3467620" [ 751.862444] env[68571]: _type = "Task" [ 751.862444] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 751.871037] env[68571]: DEBUG oslo_vmware.api [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Task: {'id': task-3467620, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 752.302020] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 752.302020] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Creating directory with path [datastore1] vmware_temp/ea47d487-ce00-4601-9ac9-4fb86b60d0d2/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 752.302020] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a05f1d6e-44ac-4a7c-ae15-1c4cd73e261b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 752.312711] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Created directory with path [datastore1] vmware_temp/ea47d487-ce00-4601-9ac9-4fb86b60d0d2/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 752.312908] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Fetch image to [datastore1] vmware_temp/ea47d487-ce00-4601-9ac9-4fb86b60d0d2/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 752.313099] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/ea47d487-ce00-4601-9ac9-4fb86b60d0d2/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 752.313838] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b842e2c8-35f4-4163-9f2f-068b8b965176 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 752.320529] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2880e0b4-4309-4b29-8aea-43992fff6cd6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 752.329325] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-98903387-46eb-461e-b87b-4e8f86e88cb1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 752.359021] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7582e657-d803-409b-b8ad-cbc154ec4c01 
{{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 752.367603] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-f4f7add0-5501-418a-97e7-5b8d2368ad70 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 752.371641] env[68571]: DEBUG oslo_vmware.api [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Task: {'id': task-3467620, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.085926} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 752.372156] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 752.372375] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 752.372559] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 752.372737] env[68571]: INFO nova.compute.manager [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Took 0.60 seconds to destroy the instance on the hypervisor. 
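The two tracebacks around this point both funnel through the same call chain: vm_util.copy_virtual_disk() starts a vCenter CopyVirtualDisk task, session._wait_for_task() hands it to oslo.vmware's wait_for_task(), a looping call polls the task, and _poll_task() raises exceptions.translate_fault(task_info.error) once the task ends in error, which is how the InvalidArgument fault surfaces here as a VimFaultException. Below is a minimal, self-contained sketch of that poll-and-translate pattern; FakeTask, poll_until_done and VimFault are illustrative stand-ins, not oslo.vmware's actual classes or signatures.

    import time


    class VimFault(Exception):
        # Stand-in for oslo_vmware.exceptions.VimFaultException: carries the
        # fault message plus the list of fault names reported by vCenter.
        def __init__(self, message, fault_list):
            super().__init__(message)
            self.fault_list = fault_list


    class FakeTask:
        # Stand-in for a vCenter task; fails with InvalidArgument on the
        # third poll, mimicking the CopyVirtualDisk failure in this log.
        def __init__(self):
            self.polls = 0

        def info(self):
            self.polls += 1
            if self.polls < 3:
                return {"state": "running", "error": None}
            return {
                "state": "error",
                "error": {
                    "msg": "A specified parameter was not correct: fileType",
                    "faults": ["InvalidArgument"],
                },
            }


    def poll_until_done(task, interval=0.01):
        # Loop the way the looping call in the traceback does: poll, sleep,
        # repeat; translate an error state into a typed exception and raise.
        while True:
            info = task.info()
            if info["state"] == "running":
                time.sleep(interval)
                continue
            if info["state"] == "error":
                raise VimFault(info["error"]["msg"], info["error"]["faults"])
            return info


    if __name__ == "__main__":
        try:
            poll_until_done(FakeTask())
        except VimFault as exc:
            print(f"{exc} Faults: {exc.fault_list}")

The fault name matters because, as the "Fault InvalidArgument not matched" line at 751.768783 shows, the exception translator only maps a known subset of fault names to specific exception classes and falls back to the generic VimFaultException for everything else.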
[ 752.374780] env[68571]: DEBUG nova.compute.claims [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 752.374955] env[68571]: DEBUG oslo_concurrency.lockutils [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 752.375183] env[68571]: DEBUG oslo_concurrency.lockutils [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 752.392737] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 752.445851] env[68571]: DEBUG oslo_vmware.rw_handles [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ea47d487-ce00-4601-9ac9-4fb86b60d0d2/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 752.503333] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 752.507067] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 752.509788] env[68571]: DEBUG oslo_vmware.rw_handles [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Completed reading data from the image iterator. 
{{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 752.509985] env[68571]: DEBUG oslo_vmware.rw_handles [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ea47d487-ce00-4601-9ac9-4fb86b60d0d2/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 752.530811] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 752.530811] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 752.530811] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 752.543035] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 752.845139] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6009309-fb86-4b7d-a5be-bc3919559a22 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 752.852776] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c508d585-b1f9-4bae-874f-6df0b6176fc7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 752.882761] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ed9ec03-1279-4879-b5ab-1c9f05c127b6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 752.891056] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-936e8351-f24e-435a-9650-5872924036ef {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 752.902866] env[68571]: DEBUG nova.compute.provider_tree [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 752.911549] env[68571]: DEBUG nova.scheduler.client.report [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Inventory has not changed for 
provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 752.927383] env[68571]: DEBUG oslo_concurrency.lockutils [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.552s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 752.927951] env[68571]: ERROR nova.compute.manager [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 752.927951] env[68571]: Faults: ['InvalidArgument'] [ 752.927951] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Traceback (most recent call last): [ 752.927951] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 752.927951] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] self.driver.spawn(context, instance, image_meta, [ 752.927951] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 752.927951] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 752.927951] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 752.927951] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] self._fetch_image_if_missing(context, vi) [ 752.927951] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 752.927951] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] image_cache(vi, tmp_image_ds_loc) [ 752.927951] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 752.928265] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] vm_util.copy_virtual_disk( [ 752.928265] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 752.928265] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] session._wait_for_task(vmdk_copy_task) [ 752.928265] env[68571]: ERROR nova.compute.manager [instance: 
f3b237f4-6e23-4474-b841-aa3ca8c1486f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 752.928265] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] return self.wait_for_task(task_ref) [ 752.928265] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 752.928265] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] return evt.wait() [ 752.928265] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 752.928265] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] result = hub.switch() [ 752.928265] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 752.928265] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] return self.greenlet.switch() [ 752.928265] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 752.928265] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] self.f(*self.args, **self.kw) [ 752.928562] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 752.928562] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] raise exceptions.translate_fault(task_info.error) [ 752.928562] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 752.928562] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Faults: ['InvalidArgument'] [ 752.928562] env[68571]: ERROR nova.compute.manager [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] [ 752.928711] env[68571]: DEBUG nova.compute.utils [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 752.929635] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.387s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 752.929856] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 752.930031] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available 
compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 752.930673] env[68571]: DEBUG nova.compute.manager [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Build of instance f3b237f4-6e23-4474-b841-aa3ca8c1486f was re-scheduled: A specified parameter was not correct: fileType [ 752.930673] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 752.931072] env[68571]: DEBUG nova.compute.manager [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 752.931252] env[68571]: DEBUG nova.compute.manager [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 752.931406] env[68571]: DEBUG nova.compute.manager [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 752.931591] env[68571]: DEBUG nova.network.neutron [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 752.933680] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-705ebf7d-67fb-42aa-9ad4-79ff4a45ba5c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 752.944131] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f5b1400-5669-4ae5-af5d-127a54bcd01d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 752.959580] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45505b9c-565b-467d-921c-f0fceacf0dbe {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 752.965664] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54d3709d-76e7-4a60-89c2-1190827083c4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 753.000741] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180940MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view 
/opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 753.000824] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 753.001060] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 753.074645] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 0eae5e9a-258a-44e5-9b4f-53100f15aa7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 753.085980] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f3b237f4-6e23-4474-b841-aa3ca8c1486f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 753.086178] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance e49a885d-c0d2-414b-b1f0-bfc3a710e9ad actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 753.086283] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 753.086405] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 15eb6744-4b26-4d7a-8639-cb3bd13e3726 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 753.086527] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 753.086686] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 753.086834] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 753.086953] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 753.087291] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3adaf481-5844-45ac-8dc9-eb396a47ed1c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 753.101084] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance c962c9c7-04a4-46ec-a46f-fac13caa6a1e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 753.113667] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 0be1ddd3-e07f-49b3-a5a7-df32b5262c30 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 753.130039] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 6894c90c-cbfb-4226-a0b5-e195f923c8e0 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 753.139943] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 7ff5aa4a-0f8a-4ed8-a918-ef2fe3410455 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 753.150155] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 52ef7a21-6254-4ac1-a3c0-93f1ac70dd9c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 753.159468] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance db77f64d-5b6c-4a88-aa1c-2622832b3f58 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 753.169119] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 7a000e36-e100-4c79-a170-8cf86a4244d7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 753.181782] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 47ab9428-5860-4c42-a5ec-a9ff608790e9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 753.196154] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance a91a0cd6-a014-43c7-8723-55825c0c8662 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 753.209338] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance a6eac04c-996e-4733-a37e-d1ba61762409 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 753.220428] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance ef53dbb8-20d3-4b5c-be29-ce75cc6c0233 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 753.234015] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 25f17a16-f752-4927-a2a5-73f1f18e5c8c has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 753.246811] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 0d78609e-cda0-4309-af6e-7d30a939443b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 753.260067] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 78ce800c-1f8e-496e-9be2-24675657acb2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 753.274538] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 1ed21e6d-6b5a-4e6e-9466-b5beceda09e1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 753.286299] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance ee0a3514-6892-4ee8-bad7-9b2867ba439e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 753.287451] env[68571]: DEBUG nova.network.neutron [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 753.299118] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance a6628de8-b7e9-466c-8cde-3f4f322c0faf has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 753.299895] env[68571]: INFO nova.compute.manager [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: f3b237f4-6e23-4474-b841-aa3ca8c1486f] Took 0.37 seconds to deallocate network for instance. [ 753.308878] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 7a350ebc-61e6-4e4d-99bc-adb67b518395 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 753.324879] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 95bc8fb9-032a-41d7-b925-dc9b60d21735 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 753.342116] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 244ba708-279e-440e-bc18-8c6ee7b83250 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 753.355984] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 753.366719] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 394b41bd-e7f7-4a77-87d1-6777e0991d50 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 753.366909] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 753.367969] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 753.414471] env[68571]: INFO nova.scheduler.client.report [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Deleted allocations for instance f3b237f4-6e23-4474-b841-aa3ca8c1486f [ 753.441376] env[68571]: DEBUG oslo_concurrency.lockutils [None req-28aa34c1-bc67-48de-a95c-e4e0e8218050 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Lock "f3b237f4-6e23-4474-b841-aa3ca8c1486f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 200.404s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 753.460558] env[68571]: DEBUG nova.compute.manager [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Starting instance... 
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 753.509888] env[68571]: DEBUG oslo_concurrency.lockutils [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 753.785304] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-304bfe52-4f82-44a5-b773-9354b4ea3c65 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 753.792979] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f6a91d2-37ff-4bc4-9206-91910c762bb4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 753.823028] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c98b4628-1ee1-4d1a-8c9f-18e2958299cd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 753.828799] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a5638e0-cdfa-40d9-9233-46b74357e256 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 753.842943] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 753.851013] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 753.865482] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 753.865814] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.865s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 753.865948] env[68571]: DEBUG oslo_concurrency.lockutils [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.356s 
{{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 753.867577] env[68571]: INFO nova.compute.claims [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 754.272854] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8abd486-e817-4fa0-af7e-996fb6c25616 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 754.277794] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d7503a3-eaff-45f4-9475-d0ce2610f58d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 754.306759] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48558965-f4e9-40cd-8124-2a1065c780c6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 754.313319] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-056d38dc-445b-4f3d-bf87-9547284056df {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 754.329284] env[68571]: DEBUG nova.compute.provider_tree [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 754.341552] env[68571]: DEBUG nova.scheduler.client.report [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 754.361064] env[68571]: DEBUG oslo_concurrency.lockutils [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.494s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 754.361064] env[68571]: DEBUG nova.compute.manager [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Start building networks asynchronously for instance. 
{{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 754.398158] env[68571]: DEBUG nova.compute.utils [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 754.399595] env[68571]: DEBUG nova.compute.manager [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 754.399765] env[68571]: DEBUG nova.network.neutron [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 754.408511] env[68571]: DEBUG nova.compute.manager [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 754.464146] env[68571]: DEBUG nova.policy [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e6bd04ded0f74de4b760e9510fda394d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '350ecc920052486e97e1c6f7bf3ec162', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 754.479704] env[68571]: DEBUG nova.compute.manager [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 754.510790] env[68571]: DEBUG nova.virt.hardware [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 754.511031] env[68571]: DEBUG nova.virt.hardware [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 754.511191] env[68571]: DEBUG nova.virt.hardware [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 754.511369] env[68571]: DEBUG nova.virt.hardware [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 754.511828] env[68571]: DEBUG nova.virt.hardware [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 754.512560] env[68571]: DEBUG nova.virt.hardware [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 754.512560] env[68571]: DEBUG nova.virt.hardware [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 754.512560] env[68571]: DEBUG nova.virt.hardware [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies 
/opt/stack/nova/nova/virt/hardware.py:471}} [ 754.512560] env[68571]: DEBUG nova.virt.hardware [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 754.512953] env[68571]: DEBUG nova.virt.hardware [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 754.512953] env[68571]: DEBUG nova.virt.hardware [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 754.513712] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd7a29ad-04a1-4f95-bf0e-74b018804885 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 754.522095] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78c5068c-3772-4951-8827-1e70e35cf44d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 754.704565] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Acquiring lock "b60eb700-434f-4bea-a84f-9071402001c3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 754.704772] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Lock "b60eb700-434f-4bea-a84f-9071402001c3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 754.829461] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 754.829699] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 754.829883] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 754.830038] env[68571]: DEBUG nova.compute.manager [None 
req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 755.013531] env[68571]: DEBUG nova.network.neutron [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Successfully created port: c77a8600-5ba7-48c9-852a-a39250469d52 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 755.255988] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dde9b6ff-3746-451f-b326-cc76441ab46c tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Acquiring lock "e49a885d-c0d2-414b-b1f0-bfc3a710e9ad" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 755.491139] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 755.491409] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 755.491484] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 755.517868] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 755.519594] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 755.519594] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 755.519594] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 755.519594] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Skipping network cache update for instance because it is Building. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 755.519594] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 755.519851] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 755.519851] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 755.519851] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 755.519851] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 755.519851] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 755.613505] env[68571]: DEBUG nova.network.neutron [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Successfully updated port: c77a8600-5ba7-48c9-852a-a39250469d52 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 755.630460] env[68571]: DEBUG oslo_concurrency.lockutils [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Acquiring lock "refresh_cache-c962c9c7-04a4-46ec-a46f-fac13caa6a1e" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 755.630460] env[68571]: DEBUG oslo_concurrency.lockutils [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Acquired lock "refresh_cache-c962c9c7-04a4-46ec-a46f-fac13caa6a1e" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 755.630460] env[68571]: DEBUG nova.network.neutron [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 755.641292] env[68571]: DEBUG nova.compute.manager [req-17e3ac6c-9a2b-4fd8-8568-49b35455b0a6 req-b86f56c4-5c8c-41bc-b24d-274f3a5fa357 service nova] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Received event network-vif-plugged-c77a8600-5ba7-48c9-852a-a39250469d52 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 755.641509] env[68571]: DEBUG oslo_concurrency.lockutils [req-17e3ac6c-9a2b-4fd8-8568-49b35455b0a6 req-b86f56c4-5c8c-41bc-b24d-274f3a5fa357 service nova] Acquiring lock "c962c9c7-04a4-46ec-a46f-fac13caa6a1e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 755.641713] env[68571]: DEBUG oslo_concurrency.lockutils [req-17e3ac6c-9a2b-4fd8-8568-49b35455b0a6 req-b86f56c4-5c8c-41bc-b24d-274f3a5fa357 service nova] Lock "c962c9c7-04a4-46ec-a46f-fac13caa6a1e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 755.641880] env[68571]: DEBUG oslo_concurrency.lockutils [req-17e3ac6c-9a2b-4fd8-8568-49b35455b0a6 req-b86f56c4-5c8c-41bc-b24d-274f3a5fa357 service nova] Lock "c962c9c7-04a4-46ec-a46f-fac13caa6a1e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 755.642054] env[68571]: DEBUG nova.compute.manager [req-17e3ac6c-9a2b-4fd8-8568-49b35455b0a6 req-b86f56c4-5c8c-41bc-b24d-274f3a5fa357 service nova] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] No waiting events found dispatching network-vif-plugged-c77a8600-5ba7-48c9-852a-a39250469d52 {{(pid=68571) pop_instance_event 
/opt/stack/nova/nova/compute/manager.py:320}} [ 755.642214] env[68571]: WARNING nova.compute.manager [req-17e3ac6c-9a2b-4fd8-8568-49b35455b0a6 req-b86f56c4-5c8c-41bc-b24d-274f3a5fa357 service nova] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Received unexpected event network-vif-plugged-c77a8600-5ba7-48c9-852a-a39250469d52 for instance with vm_state building and task_state spawning. [ 755.702470] env[68571]: DEBUG nova.network.neutron [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 755.869289] env[68571]: DEBUG nova.network.neutron [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Updating instance_info_cache with network_info: [{"id": "c77a8600-5ba7-48c9-852a-a39250469d52", "address": "fa:16:3e:88:fe:51", "network": {"id": "7333b836-ca55-4051-a015-4027cd294f18", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1972451941-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "350ecc920052486e97e1c6f7bf3ec162", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "462a7219-4deb-4225-9cf7-3131ef280363", "external-id": "nsx-vlan-transportzone-468", "segmentation_id": 468, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc77a8600-5b", "ovs_interfaceid": "c77a8600-5ba7-48c9-852a-a39250469d52", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 755.884221] env[68571]: DEBUG oslo_concurrency.lockutils [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Releasing lock "refresh_cache-c962c9c7-04a4-46ec-a46f-fac13caa6a1e" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 755.884449] env[68571]: DEBUG nova.compute.manager [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Instance network_info: |[{"id": "c77a8600-5ba7-48c9-852a-a39250469d52", "address": "fa:16:3e:88:fe:51", "network": {"id": "7333b836-ca55-4051-a015-4027cd294f18", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1972451941-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "350ecc920052486e97e1c6f7bf3ec162", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "462a7219-4deb-4225-9cf7-3131ef280363", "external-id": "nsx-vlan-transportzone-468", "segmentation_id": 468, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc77a8600-5b", "ovs_interfaceid": "c77a8600-5ba7-48c9-852a-a39250469d52", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 755.885332] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:88:fe:51', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '462a7219-4deb-4225-9cf7-3131ef280363', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'c77a8600-5ba7-48c9-852a-a39250469d52', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 755.892490] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Creating folder: Project (350ecc920052486e97e1c6f7bf3ec162). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 755.893019] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-54002df5-693c-4288-9f95-e67ffa46bdfa {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 755.904456] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Created folder: Project (350ecc920052486e97e1c6f7bf3ec162) in parent group-v692787. [ 755.904637] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Creating folder: Instances. Parent ref: group-v692831. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 755.904893] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-9ec05ae6-d206-43de-ae38-14541b479cbd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 755.914881] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Created folder: Instances in parent group-v692831. [ 755.915222] env[68571]: DEBUG oslo.service.loopingcall [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 755.915468] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 755.915685] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d4bb213f-6974-44b5-a06c-99aec57fca7a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 755.935316] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 755.935316] env[68571]: value = "task-3467623" [ 755.935316] env[68571]: _type = "Task" [ 755.935316] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 755.945145] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467623, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 756.445217] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467623, 'name': CreateVM_Task, 'duration_secs': 0.286713} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 756.445396] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 756.446046] env[68571]: DEBUG oslo_concurrency.lockutils [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 756.446220] env[68571]: DEBUG oslo_concurrency.lockutils [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 756.446541] env[68571]: DEBUG oslo_concurrency.lockutils [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 756.446803] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d9b8afa3-5a7d-4ca3-9550-d4c6efd414d5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 756.451117] env[68571]: DEBUG oslo_vmware.api [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Waiting for the task: (returnval){ [ 756.451117] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52555043-1e79-812b-6ef2-6248dcaeb73d" [ 756.451117] env[68571]: _type = "Task" [ 756.451117] env[68571]: } to 
complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 756.458254] env[68571]: DEBUG oslo_vmware.api [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52555043-1e79-812b-6ef2-6248dcaeb73d, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 756.961686] env[68571]: DEBUG oslo_concurrency.lockutils [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 756.961978] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 756.962176] env[68571]: DEBUG oslo_concurrency.lockutils [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 757.736306] env[68571]: DEBUG nova.compute.manager [req-8d11ca7e-eb07-49f2-8d7d-3fca38e73747 req-b46a1a0e-ee75-4871-b399-5e210b9f7911 service nova] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Received event network-changed-c77a8600-5ba7-48c9-852a-a39250469d52 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 757.736579] env[68571]: DEBUG nova.compute.manager [req-8d11ca7e-eb07-49f2-8d7d-3fca38e73747 req-b46a1a0e-ee75-4871-b399-5e210b9f7911 service nova] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Refreshing instance network info cache due to event network-changed-c77a8600-5ba7-48c9-852a-a39250469d52. 
{{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 757.736907] env[68571]: DEBUG oslo_concurrency.lockutils [req-8d11ca7e-eb07-49f2-8d7d-3fca38e73747 req-b46a1a0e-ee75-4871-b399-5e210b9f7911 service nova] Acquiring lock "refresh_cache-c962c9c7-04a4-46ec-a46f-fac13caa6a1e" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 757.737136] env[68571]: DEBUG oslo_concurrency.lockutils [req-8d11ca7e-eb07-49f2-8d7d-3fca38e73747 req-b46a1a0e-ee75-4871-b399-5e210b9f7911 service nova] Acquired lock "refresh_cache-c962c9c7-04a4-46ec-a46f-fac13caa6a1e" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 757.737402] env[68571]: DEBUG nova.network.neutron [req-8d11ca7e-eb07-49f2-8d7d-3fca38e73747 req-b46a1a0e-ee75-4871-b399-5e210b9f7911 service nova] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Refreshing network info cache for port c77a8600-5ba7-48c9-852a-a39250469d52 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 757.993235] env[68571]: DEBUG nova.network.neutron [req-8d11ca7e-eb07-49f2-8d7d-3fca38e73747 req-b46a1a0e-ee75-4871-b399-5e210b9f7911 service nova] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Updated VIF entry in instance network info cache for port c77a8600-5ba7-48c9-852a-a39250469d52. {{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 757.993622] env[68571]: DEBUG nova.network.neutron [req-8d11ca7e-eb07-49f2-8d7d-3fca38e73747 req-b46a1a0e-ee75-4871-b399-5e210b9f7911 service nova] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Updating instance_info_cache with network_info: [{"id": "c77a8600-5ba7-48c9-852a-a39250469d52", "address": "fa:16:3e:88:fe:51", "network": {"id": "7333b836-ca55-4051-a015-4027cd294f18", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-1972451941-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "350ecc920052486e97e1c6f7bf3ec162", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "462a7219-4deb-4225-9cf7-3131ef280363", "external-id": "nsx-vlan-transportzone-468", "segmentation_id": 468, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapc77a8600-5b", "ovs_interfaceid": "c77a8600-5ba7-48c9-852a-a39250469d52", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 758.003645] env[68571]: DEBUG oslo_concurrency.lockutils [req-8d11ca7e-eb07-49f2-8d7d-3fca38e73747 req-b46a1a0e-ee75-4871-b399-5e210b9f7911 service nova] Releasing lock "refresh_cache-c962c9c7-04a4-46ec-a46f-fac13caa6a1e" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 760.473435] env[68571]: DEBUG oslo_concurrency.lockutils [None req-71acfe76-475a-43ff-ac2f-cf1535d83ff9 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Acquiring 
lock "15eb6744-4b26-4d7a-8639-cb3bd13e3726" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 760.635076] env[68571]: DEBUG oslo_concurrency.lockutils [None req-cfb7f3c7-bd01-4162-a39f-44c04ef698e6 tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Acquiring lock "e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 765.113997] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a8305619-a023-4f57-986a-79edf6d30c69 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 767.359435] env[68571]: DEBUG oslo_concurrency.lockutils [None req-87e44a4f-5976-4463-bba6-3ad40cef6ca9 tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Acquiring lock "ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 769.270144] env[68571]: DEBUG oslo_concurrency.lockutils [None req-0fbaf758-ac4c-4d68-b4ff-90c926cf1009 tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Acquiring lock "4a43ba00-1df6-4f10-a4ce-37c4ae353cc2" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 770.530571] env[68571]: DEBUG oslo_concurrency.lockutils [None req-7bd7f510-f607-4ce9-8d24-2cf096a6bfb4 tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Acquiring lock "3adaf481-5844-45ac-8dc9-eb396a47ed1c" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 772.028137] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8952282e-1c7a-48d9-bc6f-b1d515ec9a27 tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Acquiring lock "c962c9c7-04a4-46ec-a46f-fac13caa6a1e" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 783.299951] env[68571]: DEBUG oslo_concurrency.lockutils [None req-115788d6-9044-4382-b713-cec406ac795e tempest-ImagesTestJSON-1315536367 tempest-ImagesTestJSON-1315536367-project-member] Acquiring lock "3986e039-9ed6-46e4-82b0-d3079bc45624" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 783.300247] env[68571]: DEBUG oslo_concurrency.lockutils [None req-115788d6-9044-4382-b713-cec406ac795e 
tempest-ImagesTestJSON-1315536367 tempest-ImagesTestJSON-1315536367-project-member] Lock "3986e039-9ed6-46e4-82b0-d3079bc45624" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 788.126276] env[68571]: DEBUG oslo_concurrency.lockutils [None req-6add77f1-94e8-4c73-ae1a-f374dd310014 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Acquiring lock "ca22d1a8-0a38-4e91-a3e8-8d0872d2ea31" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 788.126535] env[68571]: DEBUG oslo_concurrency.lockutils [None req-6add77f1-94e8-4c73-ae1a-f374dd310014 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Lock "ca22d1a8-0a38-4e91-a3e8-8d0872d2ea31" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 791.881492] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2c5d925a-7912-400e-93bc-ca65894842e3 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Acquiring lock "7f9587d5-7089-4e51-961e-88e83c573cb3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 791.881829] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2c5d925a-7912-400e-93bc-ca65894842e3 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Lock "7f9587d5-7089-4e51-961e-88e83c573cb3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 794.368118] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d839f976-7990-4507-8f98-9b719e6b52bf tempest-VolumesAdminNegativeTest-600178275 tempest-VolumesAdminNegativeTest-600178275-project-member] Acquiring lock "249cf445-30fa-4de2-b09d-b8210eb3effa" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 794.368957] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d839f976-7990-4507-8f98-9b719e6b52bf tempest-VolumesAdminNegativeTest-600178275 tempest-VolumesAdminNegativeTest-600178275-project-member] Lock "249cf445-30fa-4de2-b09d-b8210eb3effa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 798.198775] env[68571]: WARNING oslo_vmware.rw_handles [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 798.198775] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent 
call last): [ 798.198775] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 798.198775] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 798.198775] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 798.198775] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 798.198775] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 798.198775] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 798.198775] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 798.198775] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 798.198775] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 798.198775] env[68571]: ERROR oslo_vmware.rw_handles [ 798.198775] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/ea47d487-ce00-4601-9ac9-4fb86b60d0d2/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 798.199449] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 798.200368] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Copying Virtual Disk [datastore1] vmware_temp/ea47d487-ce00-4601-9ac9-4fb86b60d0d2/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/ea47d487-ce00-4601-9ac9-4fb86b60d0d2/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 798.200860] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5e93bcd8-26de-4ec8-885e-2db7236184a3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.210222] env[68571]: DEBUG oslo_vmware.api [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Waiting for the task: (returnval){ [ 798.210222] env[68571]: value = "task-3467624" [ 798.210222] env[68571]: _type = "Task" [ 798.210222] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 798.221599] env[68571]: DEBUG oslo_vmware.api [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Task: {'id': task-3467624, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 798.720981] env[68571]: DEBUG oslo_vmware.exceptions [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Fault InvalidArgument not matched. {{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 798.722172] env[68571]: DEBUG oslo_concurrency.lockutils [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 798.722793] env[68571]: ERROR nova.compute.manager [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 798.722793] env[68571]: Faults: ['InvalidArgument'] [ 798.722793] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Traceback (most recent call last): [ 798.722793] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 798.722793] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] yield resources [ 798.722793] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 798.722793] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] self.driver.spawn(context, instance, image_meta, [ 798.722793] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 798.722793] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] self._vmops.spawn(context, instance, image_meta, injected_files, [ 798.722793] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 798.722793] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] self._fetch_image_if_missing(context, vi) [ 798.722793] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 798.723133] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] image_cache(vi, tmp_image_ds_loc) [ 798.723133] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 798.723133] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] vm_util.copy_virtual_disk( [ 798.723133] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 798.723133] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] session._wait_for_task(vmdk_copy_task) [ 798.723133] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 798.723133] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] return self.wait_for_task(task_ref) [ 798.723133] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 798.723133] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] return evt.wait() [ 798.723133] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 798.723133] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] result = hub.switch() [ 798.723133] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 798.723133] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] return self.greenlet.switch() [ 798.723455] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 798.723455] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] self.f(*self.args, **self.kw) [ 798.723455] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 798.723455] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] raise exceptions.translate_fault(task_info.error) [ 798.723455] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 798.723455] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Faults: ['InvalidArgument'] [ 798.723455] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] [ 798.723455] env[68571]: INFO nova.compute.manager [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Terminating instance [ 798.725055] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 798.725180] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 798.726079] env[68571]: DEBUG nova.compute.manager [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 798.726292] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 798.726523] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-05210b20-fe35-4158-9a2f-14e5cd095580 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.730455] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ddf8fa9-808c-40ed-85e8-ebd8ff90f650 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.740629] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 798.742136] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-623ff075-29db-416d-9bbd-df6b8496a663 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.744182] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 798.744373] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 798.746434] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a2d0910c-d99c-4fec-b94b-db5f5b5469dc {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.752802] env[68571]: DEBUG oslo_vmware.api [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Waiting for the task: (returnval){ [ 798.752802] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52bed0f8-9b5a-80c7-e6fc-ae904a1ed455" [ 798.752802] env[68571]: _type = "Task" [ 798.752802] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 798.761349] env[68571]: DEBUG oslo_vmware.api [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52bed0f8-9b5a-80c7-e6fc-ae904a1ed455, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 798.814017] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 798.814017] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 798.814017] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Deleting the datastore file [datastore1] e49a885d-c0d2-414b-b1f0-bfc3a710e9ad {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 798.814017] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-66c06e00-b9c8-4710-bb6d-7cd9566c486d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 798.820308] env[68571]: DEBUG oslo_vmware.api [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Waiting for the task: (returnval){ [ 798.820308] env[68571]: value = "task-3467629" [ 798.820308] env[68571]: _type = "Task" [ 798.820308] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 798.831618] env[68571]: DEBUG oslo_vmware.api [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Task: {'id': task-3467629, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 799.267102] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 799.268748] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Creating directory with path [datastore1] vmware_temp/1fd6ef38-dcd1-4628-aa2c-8e97a9a5b736/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 799.269982] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c445969e-d89f-41e6-a4fb-28e9a468ca6c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 799.290712] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Created directory with path [datastore1] vmware_temp/1fd6ef38-dcd1-4628-aa2c-8e97a9a5b736/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 799.290873] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Fetch image to [datastore1] vmware_temp/1fd6ef38-dcd1-4628-aa2c-8e97a9a5b736/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 799.291049] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/1fd6ef38-dcd1-4628-aa2c-8e97a9a5b736/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 799.291863] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e24ae58-f27f-4c16-8fcb-c0f490146887 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 799.301077] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9dd3a9db-a12d-4d7f-b6fd-52fe8d02bdb4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 799.312121] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-922bd901-4c76-4dad-919c-b2d9779c8586 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 799.353203] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-86d24f1e-1aad-4537-b52e-8398aa8356e1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 799.359883] env[68571]: DEBUG oslo_vmware.api [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Task: {'id': task-3467629, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065775} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 799.361531] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 799.362049] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 799.362049] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 799.362208] env[68571]: INFO nova.compute.manager [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Took 0.64 seconds to destroy the instance on the hypervisor. 
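The SearchDatastore_Task and DeleteDatastoreFile_Task lines above show oslo.vmware's task-polling pattern: wait_for_task() blocks while _poll_task periodically re-reads the task state, logging "progress is 0%" until the task completes or fails. A minimal sketch of that loop, assuming a hypothetical fetch_task_info() callable in place of the real vSphere property-collector plumbing (illustrative only, not the oslo.vmware implementation):

    import time

    class VimFaultException(Exception):
        """Stand-in for oslo_vmware.exceptions.VimFaultException."""
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list

    def wait_for_task(fetch_task_info, interval=0.5):
        # Poll until the task leaves the queued/running states, mirroring the
        # "progress is 0%" ... "completed successfully" lines in this log.
        while True:
            info = fetch_task_info()  # hypothetical; returns e.g. {'state': 'running'}
            if info['state'] == 'success':
                return info.get('result')
            if info['state'] == 'error':
                # The poller raises a translated fault on failure, which is how
                # the InvalidArgument fault surfaces in the traceback further on.
                raise VimFaultException(info.get('faults', []), info.get('message', ''))
            time.sleep(interval)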
[ 799.366412] env[68571]: DEBUG nova.compute.claims [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 799.366412] env[68571]: DEBUG oslo_concurrency.lockutils [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 799.366412] env[68571]: DEBUG oslo_concurrency.lockutils [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 799.367134] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8b1a8508-492e-46b9-939a-52ec149f3be1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 799.395608] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 799.473792] env[68571]: DEBUG oslo_vmware.rw_handles [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1fd6ef38-dcd1-4628-aa2c-8e97a9a5b736/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 799.543316] env[68571]: DEBUG oslo_vmware.rw_handles [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 799.547506] env[68571]: DEBUG oslo_vmware.rw_handles [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1fd6ef38-dcd1-4628-aa2c-8e97a9a5b736/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 799.918487] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82202812-b076-43e0-bc5a-eab60d2ee75b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 799.929118] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11c6ea89-a019-4d56-b64e-bec0db934b9d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 799.959254] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a42b50e2-9270-4435-83d8-eb15ed10291d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 799.968355] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-002ced0e-af4a-4310-a78d-e29c21991e97 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 799.981313] env[68571]: DEBUG nova.compute.provider_tree [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 799.992443] env[68571]: DEBUG nova.scheduler.client.report [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 800.016636] env[68571]: DEBUG oslo_concurrency.lockutils [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.651s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 800.016824] env[68571]: ERROR nova.compute.manager [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 800.016824] env[68571]: Faults: ['InvalidArgument'] [ 800.016824] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Traceback (most recent call last): [ 800.016824] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 800.016824] env[68571]: ERROR 
nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] self.driver.spawn(context, instance, image_meta, [ 800.016824] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 800.016824] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] self._vmops.spawn(context, instance, image_meta, injected_files, [ 800.016824] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 800.016824] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] self._fetch_image_if_missing(context, vi) [ 800.016824] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 800.016824] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] image_cache(vi, tmp_image_ds_loc) [ 800.016824] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 800.017187] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] vm_util.copy_virtual_disk( [ 800.017187] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 800.017187] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] session._wait_for_task(vmdk_copy_task) [ 800.017187] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 800.017187] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] return self.wait_for_task(task_ref) [ 800.017187] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 800.017187] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] return evt.wait() [ 800.017187] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 800.017187] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] result = hub.switch() [ 800.017187] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 800.017187] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] return self.greenlet.switch() [ 800.017187] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 800.017187] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] self.f(*self.args, **self.kw) [ 800.017496] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 800.017496] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] raise exceptions.translate_fault(task_info.error) [ 800.017496] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 800.017496] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Faults: ['InvalidArgument'] [ 800.017496] env[68571]: ERROR nova.compute.manager [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] [ 800.018390] env[68571]: DEBUG nova.compute.utils [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 800.020248] env[68571]: DEBUG nova.compute.manager [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Build of instance e49a885d-c0d2-414b-b1f0-bfc3a710e9ad was re-scheduled: A specified parameter was not correct: fileType [ 800.020248] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 800.020248] env[68571]: DEBUG nova.compute.manager [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 800.020393] env[68571]: DEBUG nova.compute.manager [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 800.020545] env[68571]: DEBUG nova.compute.manager [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 800.020696] env[68571]: DEBUG nova.network.neutron [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 800.807458] env[68571]: DEBUG nova.network.neutron [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 800.826494] env[68571]: INFO nova.compute.manager [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Took 0.81 seconds to deallocate network for instance. [ 800.963303] env[68571]: INFO nova.scheduler.client.report [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Deleted allocations for instance e49a885d-c0d2-414b-b1f0-bfc3a710e9ad [ 800.996832] env[68571]: DEBUG oslo_concurrency.lockutils [None req-53eb5515-cc0f-45f2-8d60-f837ed56f645 tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Lock "e49a885d-c0d2-414b-b1f0-bfc3a710e9ad" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 247.060s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 800.998727] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dde9b6ff-3746-451f-b326-cc76441ab46c tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Lock "e49a885d-c0d2-414b-b1f0-bfc3a710e9ad" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 45.742s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 800.998727] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dde9b6ff-3746-451f-b326-cc76441ab46c tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Acquiring lock "e49a885d-c0d2-414b-b1f0-bfc3a710e9ad-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 800.998988] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dde9b6ff-3746-451f-b326-cc76441ab46c tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Lock "e49a885d-c0d2-414b-b1f0-bfc3a710e9ad-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 
0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 801.000030] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dde9b6ff-3746-451f-b326-cc76441ab46c tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Lock "e49a885d-c0d2-414b-b1f0-bfc3a710e9ad-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 801.002297] env[68571]: INFO nova.compute.manager [None req-dde9b6ff-3746-451f-b326-cc76441ab46c tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Terminating instance [ 801.009189] env[68571]: DEBUG nova.compute.manager [None req-dde9b6ff-3746-451f-b326-cc76441ab46c tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 801.009189] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dde9b6ff-3746-451f-b326-cc76441ab46c tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 801.009189] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7a88e16f-1f15-4679-aa3e-53bf9595383a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 801.012077] env[68571]: DEBUG nova.compute.manager [None req-6153af01-7318-4b3f-a3ac-3e24e98334da tempest-ServersWithSpecificFlavorTestJSON-2116283528 tempest-ServersWithSpecificFlavorTestJSON-2116283528-project-member] [instance: 0be1ddd3-e07f-49b3-a5a7-df32b5262c30] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 801.031690] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-71961965-041f-4e79-b10a-73d8739cd61d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 801.076798] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-dde9b6ff-3746-451f-b326-cc76441ab46c tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e49a885d-c0d2-414b-b1f0-bfc3a710e9ad could not be found. [ 801.076798] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dde9b6ff-3746-451f-b326-cc76441ab46c tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 801.076798] env[68571]: INFO nova.compute.manager [None req-dde9b6ff-3746-451f-b326-cc76441ab46c tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Took 0.07 seconds to destroy the instance on the hypervisor. 
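The waited/held timings in the lockutils lines above come from a decorator that measures how long a caller blocked before acquiring the named lock and how long it then held it. A simplified sketch of that pattern (not oslo.concurrency's actual synchronized(), which also supports semaphores, fair locks, and external file locks):

    import functools
    import threading
    import time

    _locks = {}

    def synchronized(name):
        lock = _locks.setdefault(name, threading.Lock())
        def decorator(fn):
            @functools.wraps(fn)
            def inner(*args, **kwargs):
                start = time.monotonic()
                with lock:
                    waited = time.monotonic() - start
                    print('Lock "%s" acquired by "%s" :: waited %.3fs'
                          % (name, fn.__qualname__, waited))
                    held_from = time.monotonic()
                    try:
                        return fn(*args, **kwargs)
                    finally:
                        held = time.monotonic() - held_from
                        print('Lock "%s" "released" by "%s" :: held %.3fs'
                              % (name, fn.__qualname__, held))
            return inner
        return decorator

The logged caller name is the module path plus the function's __qualname__, which for nested functions contains ".<locals>." -- that is where names like nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance in the surrounding lines come from.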
[ 801.076798] env[68571]: DEBUG oslo.service.loopingcall [None req-dde9b6ff-3746-451f-b326-cc76441ab46c tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 801.077196] env[68571]: DEBUG nova.compute.manager [None req-6153af01-7318-4b3f-a3ac-3e24e98334da tempest-ServersWithSpecificFlavorTestJSON-2116283528 tempest-ServersWithSpecificFlavorTestJSON-2116283528-project-member] [instance: 0be1ddd3-e07f-49b3-a5a7-df32b5262c30] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 801.078702] env[68571]: DEBUG nova.compute.manager [-] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 801.078829] env[68571]: DEBUG nova.network.neutron [-] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 801.115258] env[68571]: DEBUG oslo_concurrency.lockutils [None req-6153af01-7318-4b3f-a3ac-3e24e98334da tempest-ServersWithSpecificFlavorTestJSON-2116283528 tempest-ServersWithSpecificFlavorTestJSON-2116283528-project-member] Lock "0be1ddd3-e07f-49b3-a5a7-df32b5262c30" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 224.828s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 801.124223] env[68571]: DEBUG nova.network.neutron [-] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 801.135155] env[68571]: DEBUG nova.compute.manager [None req-f2c55328-3980-4dbe-bf30-b32cc1b2dd1c tempest-ImagesTestJSON-1315536367 tempest-ImagesTestJSON-1315536367-project-member] [instance: 6894c90c-cbfb-4226-a0b5-e195f923c8e0] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 801.145575] env[68571]: INFO nova.compute.manager [-] [instance: e49a885d-c0d2-414b-b1f0-bfc3a710e9ad] Took 0.07 seconds to deallocate network for instance. [ 801.204615] env[68571]: DEBUG nova.compute.manager [None req-f2c55328-3980-4dbe-bf30-b32cc1b2dd1c tempest-ImagesTestJSON-1315536367 tempest-ImagesTestJSON-1315536367-project-member] [instance: 6894c90c-cbfb-4226-a0b5-e195f923c8e0] Instance disappeared before build.
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 801.245898] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f2c55328-3980-4dbe-bf30-b32cc1b2dd1c tempest-ImagesTestJSON-1315536367 tempest-ImagesTestJSON-1315536367-project-member] Lock "6894c90c-cbfb-4226-a0b5-e195f923c8e0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 216.823s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 801.268238] env[68571]: DEBUG nova.compute.manager [None req-37114a01-3f73-48b5-b0ca-ae97ed1a5b26 tempest-ServersV294TestFqdnHostnames-789885078 tempest-ServersV294TestFqdnHostnames-789885078-project-member] [instance: 7ff5aa4a-0f8a-4ed8-a918-ef2fe3410455] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 801.326627] env[68571]: DEBUG nova.compute.manager [None req-37114a01-3f73-48b5-b0ca-ae97ed1a5b26 tempest-ServersV294TestFqdnHostnames-789885078 tempest-ServersV294TestFqdnHostnames-789885078-project-member] [instance: 7ff5aa4a-0f8a-4ed8-a918-ef2fe3410455] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 801.328369] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dde9b6ff-3746-451f-b326-cc76441ab46c tempest-ServerExternalEventsTest-57278404 tempest-ServerExternalEventsTest-57278404-project-member] Lock "e49a885d-c0d2-414b-b1f0-bfc3a710e9ad" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.330s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 801.380816] env[68571]: DEBUG oslo_concurrency.lockutils [None req-37114a01-3f73-48b5-b0ca-ae97ed1a5b26 tempest-ServersV294TestFqdnHostnames-789885078 tempest-ServersV294TestFqdnHostnames-789885078-project-member] Lock "7ff5aa4a-0f8a-4ed8-a918-ef2fe3410455" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 213.957s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 801.395696] env[68571]: DEBUG nova.compute.manager [None req-10a1a04a-478e-409f-97c4-00b1e9598ad3 tempest-TenantUsagesTestJSON-1278198306 tempest-TenantUsagesTestJSON-1278198306-project-member] [instance: 52ef7a21-6254-4ac1-a3c0-93f1ac70dd9c] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 801.439259] env[68571]: DEBUG nova.compute.manager [None req-10a1a04a-478e-409f-97c4-00b1e9598ad3 tempest-TenantUsagesTestJSON-1278198306 tempest-TenantUsagesTestJSON-1278198306-project-member] [instance: 52ef7a21-6254-4ac1-a3c0-93f1ac70dd9c] Instance disappeared before build.
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 801.468762] env[68571]: DEBUG oslo_concurrency.lockutils [None req-10a1a04a-478e-409f-97c4-00b1e9598ad3 tempest-TenantUsagesTestJSON-1278198306 tempest-TenantUsagesTestJSON-1278198306-project-member] Lock "52ef7a21-6254-4ac1-a3c0-93f1ac70dd9c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 213.866s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 801.480742] env[68571]: DEBUG nova.compute.manager [None req-4d641e68-b197-4303-b234-1a916d8d0924 tempest-ServerAddressesTestJSON-1231908386 tempest-ServerAddressesTestJSON-1231908386-project-member] [instance: db77f64d-5b6c-4a88-aa1c-2622832b3f58] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 801.513741] env[68571]: DEBUG nova.compute.manager [None req-4d641e68-b197-4303-b234-1a916d8d0924 tempest-ServerAddressesTestJSON-1231908386 tempest-ServerAddressesTestJSON-1231908386-project-member] [instance: db77f64d-5b6c-4a88-aa1c-2622832b3f58] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 801.541356] env[68571]: DEBUG oslo_concurrency.lockutils [None req-4d641e68-b197-4303-b234-1a916d8d0924 tempest-ServerAddressesTestJSON-1231908386 tempest-ServerAddressesTestJSON-1231908386-project-member] Lock "db77f64d-5b6c-4a88-aa1c-2622832b3f58" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 213.128s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 801.558184] env[68571]: DEBUG nova.compute.manager [None req-2374bcbf-15cc-48d0-af82-4deb47a498ce tempest-ImagesNegativeTestJSON-1202359250 tempest-ImagesNegativeTestJSON-1202359250-project-member] [instance: 7a000e36-e100-4c79-a170-8cf86a4244d7] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 801.598760] env[68571]: DEBUG nova.compute.manager [None req-2374bcbf-15cc-48d0-af82-4deb47a498ce tempest-ImagesNegativeTestJSON-1202359250 tempest-ImagesNegativeTestJSON-1202359250-project-member] [instance: 7a000e36-e100-4c79-a170-8cf86a4244d7] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 801.628704] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2374bcbf-15cc-48d0-af82-4deb47a498ce tempest-ImagesNegativeTestJSON-1202359250 tempest-ImagesNegativeTestJSON-1202359250-project-member] Lock "7a000e36-e100-4c79-a170-8cf86a4244d7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 213.191s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 801.642624] env[68571]: DEBUG nova.compute.manager [None req-f5a754fd-49aa-4f93-9ea3-91bab45f7731 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 47ab9428-5860-4c42-a5ec-a9ff608790e9] Starting instance...
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 801.685033] env[68571]: DEBUG nova.compute.manager [None req-f5a754fd-49aa-4f93-9ea3-91bab45f7731 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 47ab9428-5860-4c42-a5ec-a9ff608790e9] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 801.718684] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f5a754fd-49aa-4f93-9ea3-91bab45f7731 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Lock "47ab9428-5860-4c42-a5ec-a9ff608790e9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 212.165s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 801.732211] env[68571]: DEBUG nova.compute.manager [None req-c19cbc55-cf90-40d9-9e3a-4cfd43138761 tempest-ImagesOneServerTestJSON-2111796249 tempest-ImagesOneServerTestJSON-2111796249-project-member] [instance: a91a0cd6-a014-43c7-8723-55825c0c8662] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 801.761221] env[68571]: DEBUG nova.compute.manager [None req-c19cbc55-cf90-40d9-9e3a-4cfd43138761 tempest-ImagesOneServerTestJSON-2111796249 tempest-ImagesOneServerTestJSON-2111796249-project-member] [instance: a91a0cd6-a014-43c7-8723-55825c0c8662] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 801.796936] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c19cbc55-cf90-40d9-9e3a-4cfd43138761 tempest-ImagesOneServerTestJSON-2111796249 tempest-ImagesOneServerTestJSON-2111796249-project-member] Lock "a91a0cd6-a014-43c7-8723-55825c0c8662" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 211.592s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 801.808325] env[68571]: DEBUG nova.compute.manager [None req-60e62a2e-5b06-4d9f-962b-fc758ff5d907 tempest-ServerDiagnosticsTest-1371719933 tempest-ServerDiagnosticsTest-1371719933-project-member] [instance: a6eac04c-996e-4733-a37e-d1ba61762409] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 801.844648] env[68571]: DEBUG nova.compute.manager [None req-60e62a2e-5b06-4d9f-962b-fc758ff5d907 tempest-ServerDiagnosticsTest-1371719933 tempest-ServerDiagnosticsTest-1371719933-project-member] [instance: a6eac04c-996e-4733-a37e-d1ba61762409] Instance disappeared before build.
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 801.875731] env[68571]: DEBUG oslo_concurrency.lockutils [None req-60e62a2e-5b06-4d9f-962b-fc758ff5d907 tempest-ServerDiagnosticsTest-1371719933 tempest-ServerDiagnosticsTest-1371719933-project-member] Lock "a6eac04c-996e-4733-a37e-d1ba61762409" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 204.874s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 801.889792] env[68571]: DEBUG nova.compute.manager [None req-9db9e5d0-852f-4143-bea9-b8e6bca60028 tempest-ServerMetadataTestJSON-517122474 tempest-ServerMetadataTestJSON-517122474-project-member] [instance: ef53dbb8-20d3-4b5c-be29-ce75cc6c0233] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 801.924972] env[68571]: DEBUG nova.compute.manager [None req-9db9e5d0-852f-4143-bea9-b8e6bca60028 tempest-ServerMetadataTestJSON-517122474 tempest-ServerMetadataTestJSON-517122474-project-member] [instance: ef53dbb8-20d3-4b5c-be29-ce75cc6c0233] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 801.955362] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9db9e5d0-852f-4143-bea9-b8e6bca60028 tempest-ServerMetadataTestJSON-517122474 tempest-ServerMetadataTestJSON-517122474-project-member] Lock "ef53dbb8-20d3-4b5c-be29-ce75cc6c0233" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 202.228s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 801.970348] env[68571]: DEBUG nova.compute.manager [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Starting instance...
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 802.044518] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 802.044729] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 802.047050] env[68571]: INFO nova.compute.claims [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 802.551565] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4b2d663-7beb-4d09-97ff-cd3b376ceacb {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 802.561568] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af721a93-17fd-4d3c-b7b0-1a643a5d0fc8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 802.594062] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-69b0d41a-5b3d-42be-8187-b37340589f3d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 802.601680] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9211b28-4e5d-412f-9517-f9b818ff8c66 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 802.615347] env[68571]: DEBUG nova.compute.provider_tree [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 802.627023] env[68571]: DEBUG nova.scheduler.client.report [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 802.646303] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.598s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 802.646303] env[68571]: DEBUG nova.compute.manager [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 802.689340] env[68571]: DEBUG nova.compute.utils [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 802.692335] env[68571]: DEBUG nova.compute.manager [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 802.693625] env[68571]: DEBUG nova.network.neutron [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 802.704690] env[68571]: DEBUG nova.compute.manager [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 802.779302] env[68571]: DEBUG nova.policy [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd5740e66ac4d404e9f3d0800a0c3031a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '113f621758c04b3ebb193c8148594123', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 802.787835] env[68571]: DEBUG nova.compute.manager [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 802.819538] env[68571]: DEBUG nova.virt.hardware [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 802.821442] env[68571]: DEBUG nova.virt.hardware [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 802.821442] env[68571]: DEBUG nova.virt.hardware [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 802.821442] env[68571]: DEBUG nova.virt.hardware [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 802.821442] env[68571]: DEBUG nova.virt.hardware [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 802.821442] env[68571]: DEBUG nova.virt.hardware [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 802.821758] env[68571]: DEBUG nova.virt.hardware [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 802.821758] env[68571]: DEBUG nova.virt.hardware [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Build topologies for 1 vcpu(s) 1:1:1 
{{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 802.821758] env[68571]: DEBUG nova.virt.hardware [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 802.821758] env[68571]: DEBUG nova.virt.hardware [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 802.821891] env[68571]: DEBUG nova.virt.hardware [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 802.822135] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96eee094-da6a-4c9a-bdd5-82aed725f180 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 802.831024] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20a3e8e0-76d7-4a78-977b-511495069923 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 803.089503] env[68571]: DEBUG oslo_concurrency.lockutils [None req-64f40713-e892-486c-9ab0-8cf069fb217b tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Acquiring lock "25f17a16-f752-4927-a2a5-73f1f18e5c8c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 803.187281] env[68571]: DEBUG nova.network.neutron [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Successfully created port: bdd321b8-bb54-45f8-a0b0-bc78a26175fb {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 803.807466] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Acquiring lock "5e571ae2-9d45-402d-bce5-6e3721cc5374" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 803.807722] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Lock "5e571ae2-9d45-402d-bce5-6e3721cc5374" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 804.026980] env[68571]: DEBUG nova.network.neutron [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Successfully updated port: bdd321b8-bb54-45f8-a0b0-bc78a26175fb {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 804.043469] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Acquiring lock "refresh_cache-25f17a16-f752-4927-a2a5-73f1f18e5c8c" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 804.043674] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Acquired lock "refresh_cache-25f17a16-f752-4927-a2a5-73f1f18e5c8c" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 804.043834] env[68571]: DEBUG nova.network.neutron [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 804.091941] env[68571]: DEBUG nova.network.neutron [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Instance cache missing network info. 
{{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 804.289875] env[68571]: DEBUG nova.network.neutron [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Updating instance_info_cache with network_info: [{"id": "bdd321b8-bb54-45f8-a0b0-bc78a26175fb", "address": "fa:16:3e:8e:99:91", "network": {"id": "b14e3442-9e0a-4819-9bbc-30331fc0eb82", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-955615954-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "113f621758c04b3ebb193c8148594123", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5514c5a3-1294-40ad-ae96-29d5c24a3d95", "external-id": "nsx-vlan-transportzone-179", "segmentation_id": 179, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbdd321b8-bb", "ovs_interfaceid": "bdd321b8-bb54-45f8-a0b0-bc78a26175fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 804.309523] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Releasing lock "refresh_cache-25f17a16-f752-4927-a2a5-73f1f18e5c8c" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 804.309898] env[68571]: DEBUG nova.compute.manager [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Instance network_info: |[{"id": "bdd321b8-bb54-45f8-a0b0-bc78a26175fb", "address": "fa:16:3e:8e:99:91", "network": {"id": "b14e3442-9e0a-4819-9bbc-30331fc0eb82", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-955615954-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "113f621758c04b3ebb193c8148594123", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5514c5a3-1294-40ad-ae96-29d5c24a3d95", "external-id": "nsx-vlan-transportzone-179", "segmentation_id": 179, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbdd321b8-bb", "ovs_interfaceid": "bdd321b8-bb54-45f8-a0b0-bc78a26175fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, 
"meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 804.310893] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:8e:99:91', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '5514c5a3-1294-40ad-ae96-29d5c24a3d95', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'bdd321b8-bb54-45f8-a0b0-bc78a26175fb', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 804.320587] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Creating folder: Project (113f621758c04b3ebb193c8148594123). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 804.321344] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e95d092d-1e79-4885-9566-a08c716db071 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 804.342238] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Created folder: Project (113f621758c04b3ebb193c8148594123) in parent group-v692787. [ 804.342238] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Creating folder: Instances. Parent ref: group-v692837. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 804.342238] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-6712476c-3f1c-4399-b88c-819e3e7918b4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 804.350778] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Created folder: Instances in parent group-v692837. [ 804.351035] env[68571]: DEBUG oslo.service.loopingcall [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 804.351551] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 804.351867] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8c90b067-ff31-4273-af9a-d74fd267ad0e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 804.372118] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 804.372118] env[68571]: value = "task-3467634" [ 804.372118] env[68571]: _type = "Task" [ 804.372118] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 804.382602] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467634, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 804.882848] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467634, 'name': CreateVM_Task, 'duration_secs': 0.39114} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 804.883036] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 804.883704] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 804.883869] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 804.884203] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 804.884464] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b1e92617-f98b-48e5-b30d-894ab51938e8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 804.889105] env[68571]: DEBUG oslo_vmware.api [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Waiting for the task: (returnval){ [ 804.889105] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]5256a7a4-8a13-2a63-954d-7c9348889172" [ 804.889105] env[68571]: _type = "Task" [ 
804.889105] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 804.896525] env[68571]: DEBUG oslo_vmware.api [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]5256a7a4-8a13-2a63-954d-7c9348889172, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 805.051410] env[68571]: DEBUG nova.compute.manager [req-2df61954-ddba-48b9-9a14-b69a58288290 req-27446718-3f9b-4226-9e28-19967c384ae4 service nova] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Received event network-vif-plugged-bdd321b8-bb54-45f8-a0b0-bc78a26175fb {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 805.051688] env[68571]: DEBUG oslo_concurrency.lockutils [req-2df61954-ddba-48b9-9a14-b69a58288290 req-27446718-3f9b-4226-9e28-19967c384ae4 service nova] Acquiring lock "25f17a16-f752-4927-a2a5-73f1f18e5c8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 805.051850] env[68571]: DEBUG oslo_concurrency.lockutils [req-2df61954-ddba-48b9-9a14-b69a58288290 req-27446718-3f9b-4226-9e28-19967c384ae4 service nova] Lock "25f17a16-f752-4927-a2a5-73f1f18e5c8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 805.052117] env[68571]: DEBUG oslo_concurrency.lockutils [req-2df61954-ddba-48b9-9a14-b69a58288290 req-27446718-3f9b-4226-9e28-19967c384ae4 service nova] Lock "25f17a16-f752-4927-a2a5-73f1f18e5c8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 805.052349] env[68571]: DEBUG nova.compute.manager [req-2df61954-ddba-48b9-9a14-b69a58288290 req-27446718-3f9b-4226-9e28-19967c384ae4 service nova] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] No waiting events found dispatching network-vif-plugged-bdd321b8-bb54-45f8-a0b0-bc78a26175fb {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 805.052713] env[68571]: WARNING nova.compute.manager [req-2df61954-ddba-48b9-9a14-b69a58288290 req-27446718-3f9b-4226-9e28-19967c384ae4 service nova] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Received unexpected event network-vif-plugged-bdd321b8-bb54-45f8-a0b0-bc78a26175fb for instance with vm_state building and task_state deleting. 
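The records above trace nova-compute's external-event plumbing: Neutron reports network-vif-plugged for port bdd321b8-bb54-45f8-a0b0-bc78a26175fb, the manager takes the per-instance "<uuid>-events" lock, tries to pop a registered waiter for that event, finds none ("No waiting events found"), and logs the WARNING because the instance has already moved to task_state deleting while still building. A minimal sketch of that pop-or-warn pattern, in plain Python with threading primitives standing in for nova's internals (all names here are illustrative, not nova's actual classes):

    import threading

    class InstanceEvents:
        """Illustrative stand-in for the event registry the lockutils lines guard."""

        def __init__(self):
            self._lock = threading.Lock()   # plays the role of the "<uuid>-events" lock
            self._waiters = {}              # (instance_uuid, event_name) -> threading.Event

        def prepare_for_event(self, instance_uuid, event_name):
            # Called by the build path before it starts waiting for the VIF plug.
            ev = threading.Event()
            with self._lock:
                self._waiters[(instance_uuid, event_name)] = ev
            return ev                       # the waiter blocks on ev.wait(timeout)

        def pop_instance_event(self, instance_uuid, event_name):
            # Called from external_instance_event when Neutron's notification arrives.
            with self._lock:
                return self._waiters.pop((instance_uuid, event_name), None)

    def external_instance_event(registry, instance_uuid, event_name):
        waiter = registry.pop_instance_event(instance_uuid, event_name)
        if waiter is None:
            # Mirrors the WARNING above: the event arrived, but nothing was waiting,
            # here because the instance went to task_state 'deleting' mid-build.
            print("unexpected event %s for instance %s" % (event_name, instance_uuid))
        else:
            waiter.set()                    # wakes the build path waiting on the plug

Registration and dispatch are serialized by the lock, so an arriving event either finds its waiter or is reported as unexpected, which is exactly what happens here.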
[ 805.401304] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 805.401550] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 805.401756] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 807.992653] env[68571]: DEBUG nova.compute.manager [req-e4a2e03b-1188-458a-a639-efabaf573954 req-842a88e2-a109-476d-9f6f-621773c132b6 service nova] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Received event network-changed-bdd321b8-bb54-45f8-a0b0-bc78a26175fb {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 807.992950] env[68571]: DEBUG nova.compute.manager [req-e4a2e03b-1188-458a-a639-efabaf573954 req-842a88e2-a109-476d-9f6f-621773c132b6 service nova] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Refreshing instance network info cache due to event network-changed-bdd321b8-bb54-45f8-a0b0-bc78a26175fb. {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 807.993435] env[68571]: DEBUG oslo_concurrency.lockutils [req-e4a2e03b-1188-458a-a639-efabaf573954 req-842a88e2-a109-476d-9f6f-621773c132b6 service nova] Acquiring lock "refresh_cache-25f17a16-f752-4927-a2a5-73f1f18e5c8c" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 807.993691] env[68571]: DEBUG oslo_concurrency.lockutils [req-e4a2e03b-1188-458a-a639-efabaf573954 req-842a88e2-a109-476d-9f6f-621773c132b6 service nova] Acquired lock "refresh_cache-25f17a16-f752-4927-a2a5-73f1f18e5c8c" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 807.993875] env[68571]: DEBUG nova.network.neutron [req-e4a2e03b-1188-458a-a639-efabaf573954 req-842a88e2-a109-476d-9f6f-621773c132b6 service nova] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Refreshing network info cache for port bdd321b8-bb54-45f8-a0b0-bc78a26175fb {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 808.602826] env[68571]: DEBUG nova.network.neutron [req-e4a2e03b-1188-458a-a639-efabaf573954 req-842a88e2-a109-476d-9f6f-621773c132b6 service nova] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Updated VIF entry in instance network info cache for port bdd321b8-bb54-45f8-a0b0-bc78a26175fb. 
{{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 808.602826] env[68571]: DEBUG nova.network.neutron [req-e4a2e03b-1188-458a-a639-efabaf573954 req-842a88e2-a109-476d-9f6f-621773c132b6 service nova] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Updating instance_info_cache with network_info: [{"id": "bdd321b8-bb54-45f8-a0b0-bc78a26175fb", "address": "fa:16:3e:8e:99:91", "network": {"id": "b14e3442-9e0a-4819-9bbc-30331fc0eb82", "bridge": "br-int", "label": "tempest-InstanceActionsNegativeTestJSON-955615954-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "113f621758c04b3ebb193c8148594123", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "5514c5a3-1294-40ad-ae96-29d5c24a3d95", "external-id": "nsx-vlan-transportzone-179", "segmentation_id": 179, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbdd321b8-bb", "ovs_interfaceid": "bdd321b8-bb54-45f8-a0b0-bc78a26175fb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 808.612763] env[68571]: DEBUG oslo_concurrency.lockutils [req-e4a2e03b-1188-458a-a639-efabaf573954 req-842a88e2-a109-476d-9f6f-621773c132b6 service nova] Releasing lock "refresh_cache-25f17a16-f752-4927-a2a5-73f1f18e5c8c" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 809.127254] env[68571]: DEBUG oslo_concurrency.lockutils [None req-43ac91dd-1e2a-4c19-a6eb-b9668c6993b6 tempest-FloatingIPsAssociationNegativeTestJSON-559490589 tempest-FloatingIPsAssociationNegativeTestJSON-559490589-project-member] Acquiring lock "18849294-d11e-40ed-9c2a-7706f7409d9a" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 809.127587] env[68571]: DEBUG oslo_concurrency.lockutils [None req-43ac91dd-1e2a-4c19-a6eb-b9668c6993b6 tempest-FloatingIPsAssociationNegativeTestJSON-559490589 tempest-FloatingIPsAssociationNegativeTestJSON-559490589-project-member] Lock "18849294-d11e-40ed-9c2a-7706f7409d9a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 809.489351] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 809.489571] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Cleaning up deleted instances {{(pid=68571) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 809.510796] env[68571]: DEBUG nova.compute.manager [None 
req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] There are 1 instances to clean {{(pid=68571) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 809.511929] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: ccd1b692-f511-43c8-8b3d-ce92ef27670f] Instance has had 0 of 5 cleanup attempts {{(pid=68571) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11211}} [ 809.561591] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 809.565138] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Cleaning up deleted instances with incomplete migration {{(pid=68571) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 809.574306] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 810.582759] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 811.418150] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5ec4357c-b59a-488c-a1a6-459deb51b113 tempest-AttachInterfacesTestJSON-2026169319 tempest-AttachInterfacesTestJSON-2026169319-project-member] Acquiring lock "1df720d6-655c-49b6-a65d-d56b757143a7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 811.418150] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5ec4357c-b59a-488c-a1a6-459deb51b113 tempest-AttachInterfacesTestJSON-2026169319 tempest-AttachInterfacesTestJSON-2026169319-project-member] Lock "1df720d6-655c-49b6-a65d-d56b757143a7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 812.489429] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 812.489754] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 813.488946] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 813.489256] env[68571]: DEBUG oslo_service.periodic_task [None 
req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 813.508937] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 813.508937] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 813.508937] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 813.508937] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 813.508937] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c34dc06f-e3f5-45b6-ba27-25ce9c057e08 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.517642] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f068d72-419f-4295-b096-ccc065dee33a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.532312] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4cff928c-b6a9-4aa5-a0d2-2728dfd80df4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.539229] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-43700073-5f32-40f6-a55e-9df18f1fdf27 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 813.571253] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180930MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 813.571253] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 813.571253] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 813.769174] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 0eae5e9a-258a-44e5-9b4f-53100f15aa7a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 813.769174] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 813.769174] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 15eb6744-4b26-4d7a-8639-cb3bd13e3726 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 813.769174] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 813.769577] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 813.769633] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 813.769720] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 813.769862] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3adaf481-5844-45ac-8dc9-eb396a47ed1c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 813.770115] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance c962c9c7-04a4-46ec-a46f-fac13caa6a1e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 813.770202] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 25f17a16-f752-4927-a2a5-73f1f18e5c8c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 813.784179] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 95bc8fb9-032a-41d7-b925-dc9b60d21735 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 813.795573] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 244ba708-279e-440e-bc18-8c6ee7b83250 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 813.809358] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 813.821647] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 394b41bd-e7f7-4a77-87d1-6777e0991d50 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 813.833929] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance b60eb700-434f-4bea-a84f-9071402001c3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 813.845208] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3986e039-9ed6-46e4-82b0-d3079bc45624 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 813.858418] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 7f9587d5-7089-4e51-961e-88e83c573cb3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 813.872029] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 249cf445-30fa-4de2-b09d-b8210eb3effa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 813.884500] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 5e571ae2-9d45-402d-bce5-6e3721cc5374 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 813.895929] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 18849294-d11e-40ed-9c2a-7706f7409d9a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 813.906823] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 1df720d6-655c-49b6-a65d-d56b757143a7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 813.907161] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 813.907324] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 813.930443] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Refreshing inventories for resource provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 813.955121] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Updating ProviderTree inventory for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 813.955372] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Updating inventory in ProviderTree for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 813.972885] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Refreshing aggregate associations for resource provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd, aggregates: None {{(pid=68571) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 813.993989] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Refreshing trait associations for resource provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE {{(pid=68571) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 814.331576] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fb9c51a2-6cbe-46a9-b129-0f1ad6215852 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 814.340288] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-b4e02cc7-12df-4d5a-8aac-69ae624786b1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 814.371185] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55fec87a-441d-444c-9a23-5c3a06b1e434 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 814.376513] env[68571]: DEBUG oslo_concurrency.lockutils [None req-418088bb-dc24-4b40-aeb7-a26e72151b8d tempest-ServerActionsTestOtherA-263171523 tempest-ServerActionsTestOtherA-263171523-project-member] Acquiring lock "2a80d267-c2f0-4745-b23b-24717e4d9531" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 814.377262] env[68571]: DEBUG oslo_concurrency.lockutils [None req-418088bb-dc24-4b40-aeb7-a26e72151b8d tempest-ServerActionsTestOtherA-263171523 tempest-ServerActionsTestOtherA-263171523-project-member] Lock "2a80d267-c2f0-4745-b23b-24717e4d9531" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 814.381956] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16f8f5c1-6403-41a9-9f80-a49c35d20285 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 814.396252] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 814.406538] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 814.425418] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 814.425658] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.857s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 815.421043] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 815.489305] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 815.489512] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 815.489625] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 815.514254] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 815.514426] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 815.514559] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 815.514889] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 815.515113] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 815.515195] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 815.515325] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 815.515636] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Skipping network cache update for instance because it is Building. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 815.515636] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 815.515766] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 815.515766] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 815.516303] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 815.516480] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 815.516752] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 816.287852] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3ed62991-e7ab-4982-b992-98a17ee108de tempest-MultipleCreateTestJSON-976188141 tempest-MultipleCreateTestJSON-976188141-project-member] Acquiring lock "466616ca-0cad-4561-b0d6-1e34e3243418" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 816.288118] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3ed62991-e7ab-4982-b992-98a17ee108de tempest-MultipleCreateTestJSON-976188141 tempest-MultipleCreateTestJSON-976188141-project-member] Lock "466616ca-0cad-4561-b0d6-1e34e3243418" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 816.310887] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3ed62991-e7ab-4982-b992-98a17ee108de tempest-MultipleCreateTestJSON-976188141 tempest-MultipleCreateTestJSON-976188141-project-member] Acquiring lock "4bc7288c-3483-46ed-9c4f-673f86b10446" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 816.311500] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3ed62991-e7ab-4982-b992-98a17ee108de tempest-MultipleCreateTestJSON-976188141 tempest-MultipleCreateTestJSON-976188141-project-member] Lock "4bc7288c-3483-46ed-9c4f-673f86b10446" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 817.378895] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ecc40a7d-e38c-4d8a-b590-07d74b57e720 tempest-ServersTestBootFromVolume-614723402 tempest-ServersTestBootFromVolume-614723402-project-member] Acquiring lock "d910cc12-8da2-4ce6-9107-a54c870405de" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 817.379286] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ecc40a7d-e38c-4d8a-b590-07d74b57e720 tempest-ServersTestBootFromVolume-614723402 tempest-ServersTestBootFromVolume-614723402-project-member] Lock "d910cc12-8da2-4ce6-9107-a54c870405de" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 822.488992] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b3e1e414-f2fb-4ac5-b66d-778a4434410c tempest-ServerActionsTestOtherB-126684617 tempest-ServerActionsTestOtherB-126684617-project-member] Acquiring lock "d5416006-57fd-4966-90d2-3ba18d3eceba" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 822.489284] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b3e1e414-f2fb-4ac5-b66d-778a4434410c tempest-ServerActionsTestOtherB-126684617 tempest-ServerActionsTestOtherB-126684617-project-member] Lock "d5416006-57fd-4966-90d2-3ba18d3eceba" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 826.562904] env[68571]: DEBUG oslo_concurrency.lockutils [None req-766f1b84-25ef-4273-8acd-5c0cd23d89bb tempest-ServersTestManualDisk-226199740 tempest-ServersTestManualDisk-226199740-project-member] Acquiring lock "812bbf03-e2c0-4827-8ed9-cc60611a77ca" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 826.563239] env[68571]: DEBUG oslo_concurrency.lockutils [None req-766f1b84-25ef-4273-8acd-5c0cd23d89bb tempest-ServersTestManualDisk-226199740 tempest-ServersTestManualDisk-226199740-project-member] Lock "812bbf03-e2c0-4827-8ed9-cc60611a77ca" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 831.066876] env[68571]: DEBUG oslo_concurrency.lockutils [None req-6a1fc207-4e8b-4e1e-b831-20b107b2b88f tempest-ServerPasswordTestJSON-1255952750 tempest-ServerPasswordTestJSON-1255952750-project-member] Acquiring lock "4c70fd0e-9872-423e-8b7e-4c17760d88bc" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 831.066876] env[68571]: DEBUG 
oslo_concurrency.lockutils [None req-6a1fc207-4e8b-4e1e-b831-20b107b2b88f tempest-ServerPasswordTestJSON-1255952750 tempest-ServerPasswordTestJSON-1255952750-project-member] Lock "4c70fd0e-9872-423e-8b7e-4c17760d88bc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 847.078350] env[68571]: WARNING oslo_vmware.rw_handles [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 847.078350] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 847.078350] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 847.078350] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 847.078350] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 847.078350] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 847.078350] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 847.078350] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 847.078350] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 847.078350] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 847.078350] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 847.078350] env[68571]: ERROR oslo_vmware.rw_handles [ 847.079065] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/1fd6ef38-dcd1-4628-aa2c-8e97a9a5b736/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 847.080980] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 847.081271] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Copying Virtual Disk [datastore1] vmware_temp/1fd6ef38-dcd1-4628-aa2c-8e97a9a5b736/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/1fd6ef38-dcd1-4628-aa2c-8e97a9a5b736/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 847.081612] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task 
with opID=oslo.vmware-c752839d-9097-49c5-aa93-83d1bb5cd61b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 847.089500] env[68571]: DEBUG oslo_vmware.api [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Waiting for the task: (returnval){ [ 847.089500] env[68571]: value = "task-3467640" [ 847.089500] env[68571]: _type = "Task" [ 847.089500] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 847.097600] env[68571]: DEBUG oslo_vmware.api [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Task: {'id': task-3467640, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 847.604024] env[68571]: DEBUG oslo_vmware.exceptions [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Fault InvalidArgument not matched. {{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 847.604024] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 847.604024] env[68571]: ERROR nova.compute.manager [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 847.604024] env[68571]: Faults: ['InvalidArgument'] [ 847.604024] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Traceback (most recent call last): [ 847.604024] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 847.604024] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] yield resources [ 847.604024] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 847.604024] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] self.driver.spawn(context, instance, image_meta, [ 847.604436] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 847.604436] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 847.604436] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 
847.604436] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] self._fetch_image_if_missing(context, vi) [ 847.604436] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 847.604436] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] image_cache(vi, tmp_image_ds_loc) [ 847.604436] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 847.604436] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] vm_util.copy_virtual_disk( [ 847.604436] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 847.604436] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] session._wait_for_task(vmdk_copy_task) [ 847.604436] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 847.604436] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] return self.wait_for_task(task_ref) [ 847.604436] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 847.604852] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] return evt.wait() [ 847.604852] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 847.604852] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] result = hub.switch() [ 847.604852] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 847.604852] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] return self.greenlet.switch() [ 847.604852] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 847.604852] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] self.f(*self.args, **self.kw) [ 847.604852] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 847.604852] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] raise exceptions.translate_fault(task_info.error) [ 847.604852] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 847.604852] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Faults: ['InvalidArgument'] [ 847.604852] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] [ 
847.605330] env[68571]: INFO nova.compute.manager [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Terminating instance [ 847.605330] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 847.605330] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 847.605330] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6d9daf75-308f-4e30-a4f1-57408695302e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 847.607970] env[68571]: DEBUG nova.compute.manager [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 847.608317] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 847.609166] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b2bbaec-cd22-45c5-9a36-74047c00758b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 847.617090] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 847.617090] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1f6d35b9-a61f-451f-be39-dd58210a4d18 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 847.619301] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 847.619712] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Folder [datastore1] devstack-image-cache_base 
created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 847.620447] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4c8d44cf-0705-4f3b-8c40-b01205ea3db6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 847.627023] env[68571]: DEBUG oslo_vmware.api [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Waiting for the task: (returnval){ [ 847.627023] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]524d2a82-4ad0-3cde-7526-d35a7e816ff1" [ 847.627023] env[68571]: _type = "Task" [ 847.627023] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 847.632634] env[68571]: DEBUG oslo_vmware.api [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]524d2a82-4ad0-3cde-7526-d35a7e816ff1, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 847.697227] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 847.697466] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 847.697785] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Deleting the datastore file [datastore1] 0eae5e9a-258a-44e5-9b4f-53100f15aa7a {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 847.698074] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a3166321-ccbf-4027-be58-ca099d3b7ba7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 847.704112] env[68571]: DEBUG oslo_vmware.api [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Waiting for the task: (returnval){ [ 847.704112] env[68571]: value = "task-3467642" [ 847.704112] env[68571]: _type = "Task" [ 847.704112] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 847.711679] env[68571]: DEBUG oslo_vmware.api [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Task: {'id': task-3467642, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 848.137591] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 848.137893] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Creating directory with path [datastore1] vmware_temp/46ed3801-046c-484b-9b1a-9386afe66787/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 848.138182] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-61e00b05-d8b5-4571-b6b6-ec83df14b461 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 848.154384] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Created directory with path [datastore1] vmware_temp/46ed3801-046c-484b-9b1a-9386afe66787/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 848.154674] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Fetch image to [datastore1] vmware_temp/46ed3801-046c-484b-9b1a-9386afe66787/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 848.154911] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/46ed3801-046c-484b-9b1a-9386afe66787/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 848.155728] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68673093-c5b9-4ad8-b3f3-72545f8195a9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 848.162270] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa523c90-f182-4b88-9230-503a4a80e004 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 848.171299] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-203f823d-718f-45f5-8d8d-d6e67cfe35a0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 848.203335] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6fec40a-a97a-4ce1-b558-705fc006e8cb {{(pid=68571) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 848.214551] env[68571]: DEBUG oslo_vmware.api [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Task: {'id': task-3467642, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.063395} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 848.215158] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 848.215416] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 848.215662] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 848.215901] env[68571]: INFO nova.compute.manager [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Took 0.61 seconds to destroy the instance on the hypervisor. 
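The entries above show the oslo.vmware task pattern this log repeats throughout: an asynchronous vSphere task is created (here task-3467642 for FileManager.DeleteDatastoreFile_Task), wait_for_task polls it ("progress is 0%"), and the poll either logs "completed successfully" with duration_secs or raises a translated fault, as the earlier CopyVirtualDisk_Task did with InvalidArgument. A minimal sketch of that pattern against the public oslo.vmware API follows; the vCenter host, credentials, and datacenter moref value are placeholders, not values from this environment.

# Minimal sketch of the session / invoke_api / wait_for_task pattern in
# the surrounding log, using the public oslo.vmware API. The host,
# credentials, and 'datacenter-2' moref value are placeholders.
from oslo_vmware import api, vim_util

session = api.VMwareAPISession(
    'vcenter.example.org', 'user', 'secret',
    api_retry_count=10, task_poll_interval=0.5)

# Placeholder managed-object reference for the datacenter.
dc_ref = vim_util.get_moref('datacenter-2', 'Datacenter')

# Start the asynchronous delete; vCenter returns a Task moref immediately.
task = session.invoke_api(
    session.vim, 'DeleteDatastoreFile_Task',
    session.vim.service_content.fileManager,
    name='[datastore1] 0eae5e9a-258a-44e5-9b4f-53100f15aa7a',
    datacenter=dc_ref)

# wait_for_task polls task_info (the "progress is 0%" lines above) and
# either returns the completed info or raises the fault translated by
# oslo_vmware.exceptions, e.g. VimFaultException for InvalidArgument.
session.wait_for_task(task)

The CopyVirtualDisk_Task failure at the top of this excerpt is the error branch of exactly this loop: _poll_task saw task_info.error set and raised exceptions.translate_fault(task_info.error).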
[ 848.217473] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-972a3c89-1dca-425d-b935-97615d3b6e1b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 848.219595] env[68571]: DEBUG nova.compute.claims [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 848.219854] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 848.220149] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 848.247053] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 848.302446] env[68571]: DEBUG oslo_vmware.rw_handles [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/46ed3801-046c-484b-9b1a-9386afe66787/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 848.362286] env[68571]: DEBUG oslo_vmware.rw_handles [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 848.362481] env[68571]: DEBUG oslo_vmware.rw_handles [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/46ed3801-046c-484b-9b1a-9386afe66787/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 848.636972] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1afc7fe8-e845-486d-8a28-58117ac67bb4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 848.644196] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07beba30-9dc9-4746-a944-0d5112e67e9b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 848.673103] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52d46fa6-473e-471c-8f3a-bf1072fbec6d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 848.680297] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47087005-3cd9-411b-ad88-1484ed12ed2b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 848.693918] env[68571]: DEBUG nova.compute.provider_tree [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 848.703418] env[68571]: DEBUG nova.scheduler.client.report [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 848.717225] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.497s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 848.717748] env[68571]: ERROR nova.compute.manager [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 848.717748] env[68571]: Faults: ['InvalidArgument'] [ 848.717748] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Traceback (most recent call last): [ 848.717748] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 
848.717748] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] self.driver.spawn(context, instance, image_meta, [ 848.717748] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 848.717748] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] self._vmops.spawn(context, instance, image_meta, injected_files, [ 848.717748] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 848.717748] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] self._fetch_image_if_missing(context, vi) [ 848.717748] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 848.717748] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] image_cache(vi, tmp_image_ds_loc) [ 848.717748] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 848.718142] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] vm_util.copy_virtual_disk( [ 848.718142] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 848.718142] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] session._wait_for_task(vmdk_copy_task) [ 848.718142] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 848.718142] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] return self.wait_for_task(task_ref) [ 848.718142] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 848.718142] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] return evt.wait() [ 848.718142] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 848.718142] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] result = hub.switch() [ 848.718142] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 848.718142] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] return self.greenlet.switch() [ 848.718142] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 848.718142] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] self.f(*self.args, **self.kw) [ 848.718482] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 848.718482] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] raise exceptions.translate_fault(task_info.error) [ 848.718482] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 848.718482] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Faults: ['InvalidArgument'] [ 848.718482] env[68571]: ERROR nova.compute.manager [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] [ 848.718482] env[68571]: DEBUG nova.compute.utils [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 848.719843] env[68571]: DEBUG nova.compute.manager [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Build of instance 0eae5e9a-258a-44e5-9b4f-53100f15aa7a was re-scheduled: A specified parameter was not correct: fileType [ 848.719843] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 848.720231] env[68571]: DEBUG nova.compute.manager [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 848.720402] env[68571]: DEBUG nova.compute.manager [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 848.720553] env[68571]: DEBUG nova.compute.manager [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 848.720732] env[68571]: DEBUG nova.network.neutron [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 849.039398] env[68571]: DEBUG nova.network.neutron [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 849.050869] env[68571]: INFO nova.compute.manager [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Took 0.33 seconds to deallocate network for instance. [ 849.168604] env[68571]: INFO nova.scheduler.client.report [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Deleted allocations for instance 0eae5e9a-258a-44e5-9b4f-53100f15aa7a [ 849.194532] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a2661521-87d7-4e9f-b367-532b6fdf930d tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Lock "0eae5e9a-258a-44e5-9b4f-53100f15aa7a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 298.670s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 849.195679] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d734978e-ae98-4686-9f84-9ad4ffcd4226 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Lock "0eae5e9a-258a-44e5-9b4f-53100f15aa7a" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 97.639s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 849.195910] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d734978e-ae98-4686-9f84-9ad4ffcd4226 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Acquiring lock "0eae5e9a-258a-44e5-9b4f-53100f15aa7a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 849.196185] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d734978e-ae98-4686-9f84-9ad4ffcd4226 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Lock "0eae5e9a-258a-44e5-9b4f-53100f15aa7a-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 849.198160] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d734978e-ae98-4686-9f84-9ad4ffcd4226 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Lock "0eae5e9a-258a-44e5-9b4f-53100f15aa7a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 849.199415] env[68571]: INFO nova.compute.manager [None req-d734978e-ae98-4686-9f84-9ad4ffcd4226 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Terminating instance [ 849.201598] env[68571]: DEBUG nova.compute.manager [None req-d734978e-ae98-4686-9f84-9ad4ffcd4226 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 849.201798] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-d734978e-ae98-4686-9f84-9ad4ffcd4226 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 849.202287] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5cadc13b-0ff5-4c15-8947-8c16a3d5eaf0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 849.211297] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9974d8a9-98ed-4437-b682-616777496954 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 849.222567] env[68571]: DEBUG nova.compute.manager [None req-8b703ec1-2793-4846-a6f0-e3fdce5e72cf tempest-AttachInterfacesTestJSON-2026169319 tempest-AttachInterfacesTestJSON-2026169319-project-member] [instance: 78ce800c-1f8e-496e-9be2-24675657acb2] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 849.242717] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-d734978e-ae98-4686-9f84-9ad4ffcd4226 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 0eae5e9a-258a-44e5-9b4f-53100f15aa7a could not be found. 
[ 849.243317] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-d734978e-ae98-4686-9f84-9ad4ffcd4226 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 849.243317] env[68571]: INFO nova.compute.manager [None req-d734978e-ae98-4686-9f84-9ad4ffcd4226 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Took 0.04 seconds to destroy the instance on the hypervisor. [ 849.243529] env[68571]: DEBUG oslo.service.loopingcall [None req-d734978e-ae98-4686-9f84-9ad4ffcd4226 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 849.243671] env[68571]: DEBUG nova.compute.manager [-] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 849.243671] env[68571]: DEBUG nova.network.neutron [-] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 849.252474] env[68571]: DEBUG nova.compute.manager [None req-8b703ec1-2793-4846-a6f0-e3fdce5e72cf tempest-AttachInterfacesTestJSON-2026169319 tempest-AttachInterfacesTestJSON-2026169319-project-member] [instance: 78ce800c-1f8e-496e-9be2-24675657acb2] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 849.267125] env[68571]: DEBUG nova.network.neutron [-] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 849.278051] env[68571]: INFO nova.compute.manager [-] [instance: 0eae5e9a-258a-44e5-9b4f-53100f15aa7a] Took 0.03 seconds to deallocate network for instance. [ 849.280568] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8b703ec1-2793-4846-a6f0-e3fdce5e72cf tempest-AttachInterfacesTestJSON-2026169319 tempest-AttachInterfacesTestJSON-2026169319-project-member] Lock "78ce800c-1f8e-496e-9be2-24675657acb2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 242.462s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 849.291276] env[68571]: DEBUG nova.compute.manager [None req-9a5e537f-c469-42f2-8f69-b716684f0e52 tempest-ServerRescueNegativeTestJSON-1109921790 tempest-ServerRescueNegativeTestJSON-1109921790-project-member] [instance: 0d78609e-cda0-4309-af6e-7d30a939443b] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 849.323933] env[68571]: DEBUG nova.compute.manager [None req-9a5e537f-c469-42f2-8f69-b716684f0e52 tempest-ServerRescueNegativeTestJSON-1109921790 tempest-ServerRescueNegativeTestJSON-1109921790-project-member] [instance: 0d78609e-cda0-4309-af6e-7d30a939443b] Instance disappeared before build. 
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 849.348164] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9a5e537f-c469-42f2-8f69-b716684f0e52 tempest-ServerRescueNegativeTestJSON-1109921790 tempest-ServerRescueNegativeTestJSON-1109921790-project-member] Lock "0d78609e-cda0-4309-af6e-7d30a939443b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 242.529s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 849.357807] env[68571]: DEBUG nova.compute.manager [None req-d06bdd6f-c5c0-4aaf-b5a9-d7a41ea67cb8 tempest-ServerRescueNegativeTestJSON-1109921790 tempest-ServerRescueNegativeTestJSON-1109921790-project-member] [instance: 1ed21e6d-6b5a-4e6e-9466-b5beceda09e1] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 849.386650] env[68571]: DEBUG nova.compute.manager [None req-d06bdd6f-c5c0-4aaf-b5a9-d7a41ea67cb8 tempest-ServerRescueNegativeTestJSON-1109921790 tempest-ServerRescueNegativeTestJSON-1109921790-project-member] [instance: 1ed21e6d-6b5a-4e6e-9466-b5beceda09e1] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 849.409817] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d734978e-ae98-4686-9f84-9ad4ffcd4226 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Lock "0eae5e9a-258a-44e5-9b4f-53100f15aa7a" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.214s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 849.418078] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d06bdd6f-c5c0-4aaf-b5a9-d7a41ea67cb8 tempest-ServerRescueNegativeTestJSON-1109921790 tempest-ServerRescueNegativeTestJSON-1109921790-project-member] Lock "1ed21e6d-6b5a-4e6e-9466-b5beceda09e1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 239.797s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 849.426493] env[68571]: DEBUG nova.compute.manager [None req-a73180a1-613c-42ab-8de9-a93f490170e4 tempest-ServersTestFqdnHostnames-1143824028 tempest-ServersTestFqdnHostnames-1143824028-project-member] [instance: ee0a3514-6892-4ee8-bad7-9b2867ba439e] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 849.449578] env[68571]: DEBUG nova.compute.manager [None req-a73180a1-613c-42ab-8de9-a93f490170e4 tempest-ServersTestFqdnHostnames-1143824028 tempest-ServersTestFqdnHostnames-1143824028-project-member] [instance: ee0a3514-6892-4ee8-bad7-9b2867ba439e] Instance disappeared before build. 
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 849.469627] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a73180a1-613c-42ab-8de9-a93f490170e4 tempest-ServersTestFqdnHostnames-1143824028 tempest-ServersTestFqdnHostnames-1143824028-project-member] Lock "ee0a3514-6892-4ee8-bad7-9b2867ba439e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 237.318s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 849.479356] env[68571]: DEBUG nova.compute.manager [None req-0bc870a8-7e05-4d7f-bd1d-ddd77a44cd6b tempest-MultipleCreateTestJSON-976188141 tempest-MultipleCreateTestJSON-976188141-project-member] [instance: a6628de8-b7e9-466c-8cde-3f4f322c0faf] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 849.504863] env[68571]: DEBUG nova.compute.manager [None req-0bc870a8-7e05-4d7f-bd1d-ddd77a44cd6b tempest-MultipleCreateTestJSON-976188141 tempest-MultipleCreateTestJSON-976188141-project-member] [instance: a6628de8-b7e9-466c-8cde-3f4f322c0faf] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 849.525318] env[68571]: DEBUG oslo_concurrency.lockutils [None req-0bc870a8-7e05-4d7f-bd1d-ddd77a44cd6b tempest-MultipleCreateTestJSON-976188141 tempest-MultipleCreateTestJSON-976188141-project-member] Lock "a6628de8-b7e9-466c-8cde-3f4f322c0faf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 235.173s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 849.533914] env[68571]: DEBUG nova.compute.manager [None req-0bc870a8-7e05-4d7f-bd1d-ddd77a44cd6b tempest-MultipleCreateTestJSON-976188141 tempest-MultipleCreateTestJSON-976188141-project-member] [instance: 7a350ebc-61e6-4e4d-99bc-adb67b518395] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 849.556362] env[68571]: DEBUG nova.compute.manager [None req-0bc870a8-7e05-4d7f-bd1d-ddd77a44cd6b tempest-MultipleCreateTestJSON-976188141 tempest-MultipleCreateTestJSON-976188141-project-member] [instance: 7a350ebc-61e6-4e4d-99bc-adb67b518395] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 849.579493] env[68571]: DEBUG oslo_concurrency.lockutils [None req-0bc870a8-7e05-4d7f-bd1d-ddd77a44cd6b tempest-MultipleCreateTestJSON-976188141 tempest-MultipleCreateTestJSON-976188141-project-member] Lock "7a350ebc-61e6-4e4d-99bc-adb67b518395" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 235.200s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 849.588317] env[68571]: DEBUG nova.compute.manager [None req-b0a08811-0ed4-44fb-91fc-7cbafef40caf tempest-InstanceActionsV221TestJSON-79637547 tempest-InstanceActionsV221TestJSON-79637547-project-member] [instance: 95bc8fb9-032a-41d7-b925-dc9b60d21735] Starting instance... 
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 849.610811] env[68571]: DEBUG nova.compute.manager [None req-b0a08811-0ed4-44fb-91fc-7cbafef40caf tempest-InstanceActionsV221TestJSON-79637547 tempest-InstanceActionsV221TestJSON-79637547-project-member] [instance: 95bc8fb9-032a-41d7-b925-dc9b60d21735] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 849.634144] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b0a08811-0ed4-44fb-91fc-7cbafef40caf tempest-InstanceActionsV221TestJSON-79637547 tempest-InstanceActionsV221TestJSON-79637547-project-member] Lock "95bc8fb9-032a-41d7-b925-dc9b60d21735" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 222.840s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 849.642741] env[68571]: DEBUG nova.compute.manager [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 849.691655] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 849.691890] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 849.693331] env[68571]: INFO nova.compute.claims [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 850.032958] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d13c6ee4-3e21-4f55-8a33-80924920e6bf {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 850.040896] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-103686b6-1eb0-4cbd-8465-eb9d2966e6c2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 850.074726] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5ef2a68-c825-4a57-8943-9292b24d2b19 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 850.083395] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b266e9b4-d32d-4e61-b8c1-b7f46e1c4d50 {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 850.097323] env[68571]: DEBUG nova.compute.provider_tree [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 850.106051] env[68571]: DEBUG nova.scheduler.client.report [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 850.119648] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.428s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 850.120234] env[68571]: DEBUG nova.compute.manager [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 850.155190] env[68571]: DEBUG nova.compute.utils [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 850.156892] env[68571]: DEBUG nova.compute.manager [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 850.157589] env[68571]: DEBUG nova.network.neutron [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 850.167520] env[68571]: DEBUG nova.compute.manager [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Start building block device mappings for instance. 
{{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 850.224966] env[68571]: DEBUG nova.policy [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '659fc6ccf1c44f2f858cf6b6d7311502', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '11a9da07685d44659320e7fd0780d36a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 850.238184] env[68571]: DEBUG nova.compute.manager [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Start spawning the instance on the hypervisor. {{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 850.265036] env[68571]: DEBUG nova.virt.hardware [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 850.265281] env[68571]: DEBUG nova.virt.hardware [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 850.265344] env[68571]: DEBUG nova.virt.hardware [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 850.265522] env[68571]: DEBUG nova.virt.hardware [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 850.265670] env[68571]: DEBUG nova.virt.hardware [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 850.265817] env[68571]: DEBUG 
nova.virt.hardware [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 850.266029] env[68571]: DEBUG nova.virt.hardware [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 850.266194] env[68571]: DEBUG nova.virt.hardware [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 850.266359] env[68571]: DEBUG nova.virt.hardware [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 850.266522] env[68571]: DEBUG nova.virt.hardware [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 850.266696] env[68571]: DEBUG nova.virt.hardware [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 850.267542] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4e43c2f-04aa-4671-af3c-213965cfaef0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 850.275809] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ff0460f-f2ab-4861-8f51-45944e4ec61f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 850.517498] env[68571]: DEBUG nova.network.neutron [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Successfully created port: 3183ba8a-3150-4eb3-9cbe-d8e4c7f98665 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 850.806983] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Acquiring lock "87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 850.806983] 
env[68571]: DEBUG oslo_concurrency.lockutils [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Lock "87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 851.272488] env[68571]: DEBUG nova.compute.manager [req-4616e2ec-323e-4361-85dd-ab6954e6eefa req-f18cba1e-58a7-4de9-9464-0c77ef9ce840 service nova] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Received event network-vif-plugged-3183ba8a-3150-4eb3-9cbe-d8e4c7f98665 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 851.272771] env[68571]: DEBUG oslo_concurrency.lockutils [req-4616e2ec-323e-4361-85dd-ab6954e6eefa req-f18cba1e-58a7-4de9-9464-0c77ef9ce840 service nova] Acquiring lock "244ba708-279e-440e-bc18-8c6ee7b83250-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 851.272883] env[68571]: DEBUG oslo_concurrency.lockutils [req-4616e2ec-323e-4361-85dd-ab6954e6eefa req-f18cba1e-58a7-4de9-9464-0c77ef9ce840 service nova] Lock "244ba708-279e-440e-bc18-8c6ee7b83250-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 851.273095] env[68571]: DEBUG oslo_concurrency.lockutils [req-4616e2ec-323e-4361-85dd-ab6954e6eefa req-f18cba1e-58a7-4de9-9464-0c77ef9ce840 service nova] Lock "244ba708-279e-440e-bc18-8c6ee7b83250-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 851.273229] env[68571]: DEBUG nova.compute.manager [req-4616e2ec-323e-4361-85dd-ab6954e6eefa req-f18cba1e-58a7-4de9-9464-0c77ef9ce840 service nova] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] No waiting events found dispatching network-vif-plugged-3183ba8a-3150-4eb3-9cbe-d8e4c7f98665 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 851.274099] env[68571]: WARNING nova.compute.manager [req-4616e2ec-323e-4361-85dd-ab6954e6eefa req-f18cba1e-58a7-4de9-9464-0c77ef9ce840 service nova] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Received unexpected event network-vif-plugged-3183ba8a-3150-4eb3-9cbe-d8e4c7f98665 for instance with vm_state building and task_state spawning. 
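The req-4616e2ec entries above show Neutron's network-vif-plugged external event arriving before the spawning thread has registered a waiter for it, so pop_instance_event finds nothing ("No waiting events found dispatching") and the manager logs the event as unexpected, which is benign while the instance is still in task_state spawning. A simplified sketch of that register-then-pop handshake follows; the flat dict keyed by (instance_uuid, event_name) and plain threading primitives are assumptions, not nova's actual eventlet-based implementation.

import threading

# Simplified register/pop handshake behind the "No waiting events found"
# and "Received unexpected event" lines above. The dict keyed by
# (instance_uuid, event_name) is an assumption, not nova's exact code.
_events = {}
_lock = threading.Lock()

def prepare_for_event(instance_uuid, event_name):
    # The spawning thread registers interest *before* plugging the VIF.
    with _lock:
        ev = threading.Event()
        _events[(instance_uuid, event_name)] = ev
        return ev

def pop_instance_event(instance_uuid, event_name):
    # Called when Neutron's external event arrives via the compute API.
    with _lock:
        ev = _events.pop((instance_uuid, event_name), None)
    if ev is None:
        # Nobody is waiting yet: exactly the WARNING in the log above.
        print('Received unexpected event %s for instance %s'
              % (event_name, instance_uuid))
    else:
        ev.set()

When the spawn path registers first, the later pop sets the event and the waiter proceeds; the warning branch is the ordering seen in this log.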
[ 851.332942] env[68571]: DEBUG nova.network.neutron [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Successfully updated port: 3183ba8a-3150-4eb3-9cbe-d8e4c7f98665 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 851.347540] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Acquiring lock "refresh_cache-244ba708-279e-440e-bc18-8c6ee7b83250" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 851.347685] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Acquired lock "refresh_cache-244ba708-279e-440e-bc18-8c6ee7b83250" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 851.347845] env[68571]: DEBUG nova.network.neutron [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 851.389642] env[68571]: DEBUG nova.network.neutron [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Instance cache missing network info. 
{{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 851.577982] env[68571]: DEBUG nova.network.neutron [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Updating instance_info_cache with network_info: [{"id": "3183ba8a-3150-4eb3-9cbe-d8e4c7f98665", "address": "fa:16:3e:79:12:b4", "network": {"id": "7e2ff714-565e-4e62-a5a4-792dbe34ec81", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-803180871-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "11a9da07685d44659320e7fd0780d36a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "37333dc2-982e-45e9-9dda-0c18417d7fa6", "external-id": "nsx-vlan-transportzone-227", "segmentation_id": 227, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3183ba8a-31", "ovs_interfaceid": "3183ba8a-3150-4eb3-9cbe-d8e4c7f98665", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 851.593171] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Releasing lock "refresh_cache-244ba708-279e-440e-bc18-8c6ee7b83250" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 851.593472] env[68571]: DEBUG nova.compute.manager [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Instance network_info: |[{"id": "3183ba8a-3150-4eb3-9cbe-d8e4c7f98665", "address": "fa:16:3e:79:12:b4", "network": {"id": "7e2ff714-565e-4e62-a5a4-792dbe34ec81", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-803180871-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "11a9da07685d44659320e7fd0780d36a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "37333dc2-982e-45e9-9dda-0c18417d7fa6", "external-id": "nsx-vlan-transportzone-227", "segmentation_id": 227, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3183ba8a-31", "ovs_interfaceid": "3183ba8a-3150-4eb3-9cbe-d8e4c7f98665", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 851.593873] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:79:12:b4', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '37333dc2-982e-45e9-9dda-0c18417d7fa6', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '3183ba8a-3150-4eb3-9cbe-d8e4c7f98665', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 851.601345] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Creating folder: Project (11a9da07685d44659320e7fd0780d36a). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 851.601878] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-d6cd1465-3c2e-4167-8c50-ed0998953af8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 851.612993] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Created folder: Project (11a9da07685d44659320e7fd0780d36a) in parent group-v692787. [ 851.613202] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Creating folder: Instances. Parent ref: group-v692841. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 851.613422] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c73c9ec2-4acb-4c3c-9d21-78a005d6bc49 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 851.621350] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Created folder: Instances in parent group-v692841. [ 851.621575] env[68571]: DEBUG oslo.service.loopingcall [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 851.621747] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 851.621942] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-8c9b4f0d-43c4-46f5-95f8-4a897c4afa6b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 851.642024] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 851.642024] env[68571]: value = "task-3467645" [ 851.642024] env[68571]: _type = "Task" [ 851.642024] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 851.648633] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467645, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 852.151599] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467645, 'name': CreateVM_Task, 'duration_secs': 0.284417} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 852.151781] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 852.152459] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 852.152621] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 852.152967] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 852.153233] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ded4dbe7-688a-4316-b122-f914cc58065a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 852.158158] env[68571]: DEBUG oslo_vmware.api [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Waiting for the task: (returnval){ [ 852.158158] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]5239166b-d4c3-a370-4700-16fcdea935be" [ 852.158158] env[68571]: _type = "Task" [ 852.158158] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 852.165964] env[68571]: DEBUG oslo_vmware.api [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]5239166b-d4c3-a370-4700-16fcdea935be, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 852.668128] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 852.668418] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 852.668563] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 853.416671] env[68571]: DEBUG nova.compute.manager [req-b5c73c87-6fee-4865-ba55-1965ec72edc5 req-2252c113-f2ab-4990-b7ae-a9cb81166d0d service nova] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Received event network-changed-3183ba8a-3150-4eb3-9cbe-d8e4c7f98665 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 853.416827] env[68571]: DEBUG nova.compute.manager [req-b5c73c87-6fee-4865-ba55-1965ec72edc5 req-2252c113-f2ab-4990-b7ae-a9cb81166d0d service nova] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Refreshing instance network info cache due to event network-changed-3183ba8a-3150-4eb3-9cbe-d8e4c7f98665. 
{{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 853.417443] env[68571]: DEBUG oslo_concurrency.lockutils [req-b5c73c87-6fee-4865-ba55-1965ec72edc5 req-2252c113-f2ab-4990-b7ae-a9cb81166d0d service nova] Acquiring lock "refresh_cache-244ba708-279e-440e-bc18-8c6ee7b83250" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 853.417731] env[68571]: DEBUG oslo_concurrency.lockutils [req-b5c73c87-6fee-4865-ba55-1965ec72edc5 req-2252c113-f2ab-4990-b7ae-a9cb81166d0d service nova] Acquired lock "refresh_cache-244ba708-279e-440e-bc18-8c6ee7b83250" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 853.417970] env[68571]: DEBUG nova.network.neutron [req-b5c73c87-6fee-4865-ba55-1965ec72edc5 req-2252c113-f2ab-4990-b7ae-a9cb81166d0d service nova] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Refreshing network info cache for port 3183ba8a-3150-4eb3-9cbe-d8e4c7f98665 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 853.703979] env[68571]: DEBUG nova.network.neutron [req-b5c73c87-6fee-4865-ba55-1965ec72edc5 req-2252c113-f2ab-4990-b7ae-a9cb81166d0d service nova] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Updated VIF entry in instance network info cache for port 3183ba8a-3150-4eb3-9cbe-d8e4c7f98665. {{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 853.704344] env[68571]: DEBUG nova.network.neutron [req-b5c73c87-6fee-4865-ba55-1965ec72edc5 req-2252c113-f2ab-4990-b7ae-a9cb81166d0d service nova] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Updating instance_info_cache with network_info: [{"id": "3183ba8a-3150-4eb3-9cbe-d8e4c7f98665", "address": "fa:16:3e:79:12:b4", "network": {"id": "7e2ff714-565e-4e62-a5a4-792dbe34ec81", "bridge": "br-int", "label": "tempest-AttachInterfacesV270Test-803180871-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "11a9da07685d44659320e7fd0780d36a", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "37333dc2-982e-45e9-9dda-0c18417d7fa6", "external-id": "nsx-vlan-transportzone-227", "segmentation_id": 227, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap3183ba8a-31", "ovs_interfaceid": "3183ba8a-3150-4eb3-9cbe-d8e4c7f98665", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 853.714513] env[68571]: DEBUG oslo_concurrency.lockutils [req-b5c73c87-6fee-4865-ba55-1965ec72edc5 req-2252c113-f2ab-4990-b7ae-a9cb81166d0d service nova] Releasing lock "refresh_cache-244ba708-279e-440e-bc18-8c6ee7b83250" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 859.143656] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1630af07-5515-4c35-aa8d-4fac497b635a tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] 
Acquiring lock "244ba708-279e-440e-bc18-8c6ee7b83250" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 871.491797] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 872.489642] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 873.489800] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 874.489585] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 874.503020] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 874.503299] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 874.503445] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 874.503591] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 874.504718] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d7039d7-bd8e-46b3-85f9-c8781d0d60f0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 874.513884] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8408dc0c-34e6-4f55-8d02-f6613c13460f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 874.527604] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21b6fc89-9dfe-442a-a024-11accc0aeaf0 {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 874.533572] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c5ae51ea-4f81-4a53-bd28-71bc42c1ea57 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 874.561809] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180938MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 874.561956] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 874.562171] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 874.638039] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 874.638039] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 15eb6744-4b26-4d7a-8639-cb3bd13e3726 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 874.638039] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 874.638039] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 874.638202] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 874.638202] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 874.638202] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3adaf481-5844-45ac-8dc9-eb396a47ed1c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 874.638202] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance c962c9c7-04a4-46ec-a46f-fac13caa6a1e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 874.638323] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 25f17a16-f752-4927-a2a5-73f1f18e5c8c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 874.638323] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 244ba708-279e-440e-bc18-8c6ee7b83250 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 874.649537] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 874.659692] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 394b41bd-e7f7-4a77-87d1-6777e0991d50 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 874.668937] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance b60eb700-434f-4bea-a84f-9071402001c3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 874.677931] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3986e039-9ed6-46e4-82b0-d3079bc45624 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 874.690514] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 7f9587d5-7089-4e51-961e-88e83c573cb3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 874.700318] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 249cf445-30fa-4de2-b09d-b8210eb3effa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 874.709664] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 5e571ae2-9d45-402d-bce5-6e3721cc5374 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 874.719427] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 18849294-d11e-40ed-9c2a-7706f7409d9a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 874.730685] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 1df720d6-655c-49b6-a65d-d56b757143a7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 874.739770] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 2a80d267-c2f0-4745-b23b-24717e4d9531 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 874.749679] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 466616ca-0cad-4561-b0d6-1e34e3243418 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 874.758971] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4bc7288c-3483-46ed-9c4f-673f86b10446 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 874.768065] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance d910cc12-8da2-4ce6-9107-a54c870405de has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 874.777327] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance d5416006-57fd-4966-90d2-3ba18d3eceba has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 874.786739] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 812bbf03-e2c0-4827-8ed9-cc60611a77ca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 874.796418] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4c70fd0e-9872-423e-8b7e-4c17760d88bc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 874.805675] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 874.806019] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 874.806098] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 875.098829] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dbb54874-3b19-493a-a0ce-d5863aecd72b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 875.106049] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed663ac9-5f23-4849-9170-cda9fb6b4bb4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 875.134914] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65c3db93-93b9-4de0-bd44-6a5d1f5ff56e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 875.141904] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3320a46-5ea6-4394-8115-fb9a1231403b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 875.156640] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 875.166453] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 875.185573] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 875.185766] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.624s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 876.180603] env[68571]: DEBUG oslo_service.periodic_task [None 
req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 876.180916] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 876.201249] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 876.488954] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 876.489214] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 876.489356] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 876.509539] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 876.509687] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 876.509821] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 876.510010] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 876.510324] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 876.510458] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Skipping network cache update for instance because it is Building. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 876.510587] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 876.510727] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 876.510848] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 876.510965] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 876.511158] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 877.488997] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 877.489351] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 877.489351] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 897.082863] env[68571]: WARNING oslo_vmware.rw_handles [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 897.082863] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 897.082863] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 897.082863] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 897.082863] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 897.082863] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 897.082863] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 897.082863] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 897.082863] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 897.082863] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 897.082863] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 897.082863] env[68571]: ERROR oslo_vmware.rw_handles [ 897.083527] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/46ed3801-046c-484b-9b1a-9386afe66787/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 897.085069] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 897.085322] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Copying Virtual Disk [datastore1] vmware_temp/46ed3801-046c-484b-9b1a-9386afe66787/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/46ed3801-046c-484b-9b1a-9386afe66787/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 897.085607] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6ecdff9f-2a4f-432f-88b8-4d39bb101b16 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 897.093883] env[68571]: DEBUG oslo_vmware.api [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Waiting for the task: (returnval){ [ 897.093883] 
env[68571]: value = "task-3467646" [ 897.093883] env[68571]: _type = "Task" [ 897.093883] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 897.101905] env[68571]: DEBUG oslo_vmware.api [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Task: {'id': task-3467646, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 897.605104] env[68571]: DEBUG oslo_vmware.exceptions [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Fault InvalidArgument not matched. {{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 897.605104] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 897.605468] env[68571]: ERROR nova.compute.manager [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 897.605468] env[68571]: Faults: ['InvalidArgument'] [ 897.605468] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Traceback (most recent call last): [ 897.605468] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 897.605468] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] yield resources [ 897.605468] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 897.605468] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] self.driver.spawn(context, instance, image_meta, [ 897.605468] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 897.605468] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 897.605468] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 897.605468] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] self._fetch_image_if_missing(context, vi) [ 897.605468] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 897.605842] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] image_cache(vi, tmp_image_ds_loc) [ 
897.605842] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 897.605842] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] vm_util.copy_virtual_disk( [ 897.605842] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 897.605842] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] session._wait_for_task(vmdk_copy_task) [ 897.605842] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 897.605842] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] return self.wait_for_task(task_ref) [ 897.605842] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 897.605842] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] return evt.wait() [ 897.605842] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 897.605842] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] result = hub.switch() [ 897.605842] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 897.605842] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] return self.greenlet.switch() [ 897.606273] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 897.606273] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] self.f(*self.args, **self.kw) [ 897.606273] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 897.606273] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] raise exceptions.translate_fault(task_info.error) [ 897.606273] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 897.606273] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Faults: ['InvalidArgument'] [ 897.606273] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] [ 897.606273] env[68571]: INFO nova.compute.manager [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Terminating instance [ 897.607813] env[68571]: DEBUG oslo_concurrency.lockutils [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Acquired lock 
"[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 897.607963] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 897.608234] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-46f986c3-619d-431b-ae1d-d55dfaba4807 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 897.610770] env[68571]: DEBUG nova.compute.manager [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 897.610770] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 897.611406] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-55bf1024-4211-4d3c-9025-53ea74014608 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 897.617860] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 897.618108] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-7b2f153e-a75b-4837-8017-3c941c9b4511 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 897.621352] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 897.621537] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 897.622530] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-39efee71-2320-404e-bbad-bd6fd01cdf11 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 897.627228] env[68571]: DEBUG oslo_vmware.api [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Waiting for the task: (returnval){ [ 897.627228] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52276b78-58c0-7cfb-baeb-c06bf65fe568" [ 897.627228] env[68571]: _type = "Task" [ 897.627228] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 897.634664] env[68571]: DEBUG oslo_vmware.api [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52276b78-58c0-7cfb-baeb-c06bf65fe568, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 897.682038] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 897.682283] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 897.682466] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Deleting the datastore file [datastore1] 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 897.682730] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c569d8e3-2303-4b53-ad43-b456eaf5c9ce {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 897.688753] env[68571]: DEBUG oslo_vmware.api [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Waiting for the task: (returnval){ [ 897.688753] env[68571]: value = "task-3467648" [ 897.688753] env[68571]: _type = "Task" [ 897.688753] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 897.696245] env[68571]: DEBUG oslo_vmware.api [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Task: {'id': task-3467648, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 898.137593] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 898.137914] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Creating directory with path [datastore1] vmware_temp/f7a95201-4233-48eb-90fe-627ec83eb749/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 898.138146] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5086a0e3-7e6c-49a4-a5b4-7f7df6c13172 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 898.151300] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Created directory with path [datastore1] vmware_temp/f7a95201-4233-48eb-90fe-627ec83eb749/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 898.151562] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Fetch image to [datastore1] vmware_temp/f7a95201-4233-48eb-90fe-627ec83eb749/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 898.151797] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/f7a95201-4233-48eb-90fe-627ec83eb749/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 898.152604] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ea438ab-f305-464b-8590-fdafeba36d98 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 898.159817] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-481ec51c-6369-4c03-9f81-a806b676e0dc {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 898.169401] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-250c0932-7112-4d43-8621-58e90c4b5693 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 898.203435] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fe7ebad8-49bb-42c2-a46c-cec99d619811 {{(pid=68571) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 898.210560] env[68571]: DEBUG oslo_vmware.api [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Task: {'id': task-3467648, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073818} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 898.212093] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 898.212285] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 898.212463] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 898.212641] env[68571]: INFO nova.compute.manager [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Took 0.60 seconds to destroy the instance on the hypervisor. 
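The InvalidArgument fault recorded above surfaces through oslo_vmware's task polling: nova hands the CopyVirtualDisk task to _wait_for_task, which parks the greenlet on a looping call that re-reads the task's state and, once vCenter marks it errored, raises the translated fault that the compute manager then logs. A minimal sketch of that poll-and-translate loop, assuming a hypothetical fetch_task_info() callable in place of the real vSphere property-collector reads (the class below is a simplified stand-in, not oslo_vmware's implementation):

    import time

    class VimFaultException(Exception):
        # Simplified stand-in for oslo_vmware.exceptions.VimFaultException.
        def __init__(self, fault_list, message):
            super().__init__(message)
            self.fault_list = fault_list

    def wait_for_task(fetch_task_info, interval=0.5):
        # fetch_task_info() is assumed to return an object with .state
        # ('running', 'success' or 'error') and, on error, an .error
        # carrying .faults and .msg -- mirroring the TaskInfo vCenter
        # reports for tasks like task-3467648 above.
        while True:
            info = fetch_task_info()
            if info.state == 'success':
                return info
            if info.state == 'error':
                # The real code translates the fault before raising; here
                # that is 'A specified parameter was not correct: fileType'
                # with Faults: ['InvalidArgument'].
                raise VimFaultException(info.error.faults, info.error.msg)
            time.sleep(interval)  # oslo_vmware yields via eventlet instead

In the trace above the loop takes the error branch, which is why the identical VimFaultException reappears verbatim later in this section when _build_and_run_instance re-raises it.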
[ 898.214387] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-cd18ff69-d8da-440c-9fb0-6c059ef6929e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 898.216307] env[68571]: DEBUG nova.compute.claims [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 898.216487] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 898.216700] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 898.237541] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 898.301874] env[68571]: DEBUG oslo_vmware.rw_handles [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f7a95201-4233-48eb-90fe-627ec83eb749/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 898.362319] env[68571]: DEBUG oslo_vmware.rw_handles [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 898.362486] env[68571]: DEBUG oslo_vmware.rw_handles [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f7a95201-4233-48eb-90fe-627ec83eb749/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 898.640503] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-975c616a-1566-4b31-8cbe-82ca76b8978c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 898.648853] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5c2b516-bdb8-4218-87bb-dcd506d48bf2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 898.679872] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0e6b643-b295-4755-9ac6-221b4f6c7043 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 898.686899] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c2b74292-3629-4133-9b95-dc56de158356 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 898.700036] env[68571]: DEBUG nova.compute.provider_tree [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 898.708751] env[68571]: DEBUG nova.scheduler.client.report [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 898.722364] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.506s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 898.722895] env[68571]: ERROR nova.compute.manager [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 898.722895] env[68571]: Faults: ['InvalidArgument'] [ 898.722895] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Traceback (most recent call last): [ 898.722895] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 898.722895] env[68571]: ERROR nova.compute.manager 
[instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] self.driver.spawn(context, instance, image_meta, [ 898.722895] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 898.722895] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 898.722895] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 898.722895] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] self._fetch_image_if_missing(context, vi) [ 898.722895] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 898.722895] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] image_cache(vi, tmp_image_ds_loc) [ 898.722895] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 898.723291] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] vm_util.copy_virtual_disk( [ 898.723291] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 898.723291] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] session._wait_for_task(vmdk_copy_task) [ 898.723291] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 898.723291] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] return self.wait_for_task(task_ref) [ 898.723291] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 898.723291] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] return evt.wait() [ 898.723291] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 898.723291] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] result = hub.switch() [ 898.723291] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 898.723291] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] return self.greenlet.switch() [ 898.723291] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 898.723291] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] self.f(*self.args, **self.kw) [ 898.723717] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 898.723717] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] raise exceptions.translate_fault(task_info.error) [ 898.723717] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 898.723717] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Faults: ['InvalidArgument'] [ 898.723717] env[68571]: ERROR nova.compute.manager [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] [ 898.723717] env[68571]: DEBUG nova.compute.utils [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 898.724944] env[68571]: DEBUG nova.compute.manager [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Build of instance 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d was re-scheduled: A specified parameter was not correct: fileType [ 898.724944] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 898.725331] env[68571]: DEBUG nova.compute.manager [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 898.725506] env[68571]: DEBUG nova.compute.manager [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 898.725673] env[68571]: DEBUG nova.compute.manager [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 898.725838] env[68571]: DEBUG nova.network.neutron [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 899.047196] env[68571]: DEBUG nova.network.neutron [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 899.059796] env[68571]: INFO nova.compute.manager [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d] Took 0.33 seconds to deallocate network for instance. [ 899.158964] env[68571]: INFO nova.scheduler.client.report [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Deleted allocations for instance 349dd3c9-5769-458c-b7fa-ef08ce7d6b5d [ 899.183394] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dc9b35e2-fef5-47ac-aab5-4df895285221 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Lock "349dd3c9-5769-458c-b7fa-ef08ce7d6b5d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 342.031s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 899.198451] env[68571]: DEBUG nova.compute.manager [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Starting instance... 
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 899.245138] env[68571]: DEBUG oslo_concurrency.lockutils [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 899.245382] env[68571]: DEBUG oslo_concurrency.lockutils [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 899.246893] env[68571]: INFO nova.compute.claims [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 899.585834] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ed04ccb-41ae-40f4-a728-82cad3935641 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 899.593754] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9106d99f-f85c-4fe4-af95-e3ab23b9daec {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 899.624548] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78f90295-3c5d-4016-8c4c-4933880d3f99 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 899.631602] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-687aecf0-0c5c-4791-a2f8-b9b13f1987bb {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 899.644442] env[68571]: DEBUG nova.compute.provider_tree [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 899.653274] env[68571]: DEBUG nova.scheduler.client.report [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 899.667501] env[68571]: DEBUG oslo_concurrency.lockutils [None 
req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.422s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 899.667984] env[68571]: DEBUG nova.compute.manager [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 899.701319] env[68571]: DEBUG nova.compute.utils [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 899.702793] env[68571]: DEBUG nova.compute.manager [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 899.702968] env[68571]: DEBUG nova.network.neutron [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 899.711474] env[68571]: DEBUG nova.compute.manager [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 899.771532] env[68571]: DEBUG nova.policy [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '75464428d107469f99f4308cfdb6b2df', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '506bd7cf3d9c4c54aabe7ef0be376fe9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 899.777638] env[68571]: DEBUG nova.compute.manager [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 899.800207] env[68571]: DEBUG nova.virt.hardware [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 899.800478] env[68571]: DEBUG nova.virt.hardware [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 899.800652] env[68571]: DEBUG nova.virt.hardware [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 899.800838] env[68571]: DEBUG nova.virt.hardware [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 899.800984] env[68571]: DEBUG nova.virt.hardware [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 899.802488] env[68571]: DEBUG nova.virt.hardware [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 899.802488] env[68571]: DEBUG nova.virt.hardware [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 899.802488] env[68571]: DEBUG nova.virt.hardware [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 899.802488] env[68571]: DEBUG 
nova.virt.hardware [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 899.802488] env[68571]: DEBUG nova.virt.hardware [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 899.802840] env[68571]: DEBUG nova.virt.hardware [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 899.803288] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d877e3f6-a566-4365-a921-4df88e9e82e9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 899.815220] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12b0b68b-ae0b-4af6-ad7d-800276c0aa7b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 900.098336] env[68571]: DEBUG nova.network.neutron [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Successfully created port: 5d2f10d8-5a8e-48f9-970a-c361a18d8f3a {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 900.693669] env[68571]: DEBUG nova.network.neutron [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Successfully updated port: 5d2f10d8-5a8e-48f9-970a-c361a18d8f3a {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 900.706797] env[68571]: DEBUG oslo_concurrency.lockutils [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquiring lock "refresh_cache-3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 900.707490] env[68571]: DEBUG oslo_concurrency.lockutils [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquired lock "refresh_cache-3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 900.707490] env[68571]: DEBUG nova.network.neutron [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 900.746362] env[68571]: DEBUG nova.network.neutron [None 
req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 900.904075] env[68571]: DEBUG nova.network.neutron [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Updating instance_info_cache with network_info: [{"id": "5d2f10d8-5a8e-48f9-970a-c361a18d8f3a", "address": "fa:16:3e:2e:64:ba", "network": {"id": "20ed8763-0c02-410b-9f5d-cb667bfdaa58", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1635789088-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "506bd7cf3d9c4c54aabe7ef0be376fe9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "50886eea-591a-452c-a27b-5f22cfc9df85", "external-id": "nsx-vlan-transportzone-578", "segmentation_id": 578, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5d2f10d8-5a", "ovs_interfaceid": "5d2f10d8-5a8e-48f9-970a-c361a18d8f3a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 900.916671] env[68571]: DEBUG oslo_concurrency.lockutils [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Releasing lock "refresh_cache-3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 900.917152] env[68571]: DEBUG nova.compute.manager [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Instance network_info: |[{"id": "5d2f10d8-5a8e-48f9-970a-c361a18d8f3a", "address": "fa:16:3e:2e:64:ba", "network": {"id": "20ed8763-0c02-410b-9f5d-cb667bfdaa58", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1635789088-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "506bd7cf3d9c4c54aabe7ef0be376fe9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "50886eea-591a-452c-a27b-5f22cfc9df85", "external-id": "nsx-vlan-transportzone-578", "segmentation_id": 578, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5d2f10d8-5a", "ovs_interfaceid": 
"5d2f10d8-5a8e-48f9-970a-c361a18d8f3a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 900.918073] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:2e:64:ba', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '50886eea-591a-452c-a27b-5f22cfc9df85', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '5d2f10d8-5a8e-48f9-970a-c361a18d8f3a', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 900.925708] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Creating folder: Project (506bd7cf3d9c4c54aabe7ef0be376fe9). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 900.926313] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c2cf2c82-8c4e-4b7f-85c5-d2bde7b29ee2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 900.936628] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Created folder: Project (506bd7cf3d9c4c54aabe7ef0be376fe9) in parent group-v692787. [ 900.936823] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Creating folder: Instances. Parent ref: group-v692844. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 900.937379] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0c2c7765-1123-43bc-9335-840b864267fa {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 900.946583] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Created folder: Instances in parent group-v692844. [ 900.946814] env[68571]: DEBUG oslo.service.loopingcall [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 900.946993] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 900.947202] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-c3d04bc2-e107-4a7c-96b2-0a8e09158d15 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 900.965246] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 900.965246] env[68571]: value = "task-3467651" [ 900.965246] env[68571]: _type = "Task" [ 900.965246] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 900.974260] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467651, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 901.284376] env[68571]: DEBUG nova.compute.manager [req-fe4d190e-d34c-49ef-aa17-0bb72e642a53 req-c67a9d85-76cf-440f-9245-fd4162ef0ee0 service nova] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Received event network-vif-plugged-5d2f10d8-5a8e-48f9-970a-c361a18d8f3a {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 901.284506] env[68571]: DEBUG oslo_concurrency.lockutils [req-fe4d190e-d34c-49ef-aa17-0bb72e642a53 req-c67a9d85-76cf-440f-9245-fd4162ef0ee0 service nova] Acquiring lock "3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 901.284698] env[68571]: DEBUG oslo_concurrency.lockutils [req-fe4d190e-d34c-49ef-aa17-0bb72e642a53 req-c67a9d85-76cf-440f-9245-fd4162ef0ee0 service nova] Lock "3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 901.284870] env[68571]: DEBUG oslo_concurrency.lockutils [req-fe4d190e-d34c-49ef-aa17-0bb72e642a53 req-c67a9d85-76cf-440f-9245-fd4162ef0ee0 service nova] Lock "3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 901.285049] env[68571]: DEBUG nova.compute.manager [req-fe4d190e-d34c-49ef-aa17-0bb72e642a53 req-c67a9d85-76cf-440f-9245-fd4162ef0ee0 service nova] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] No waiting events found dispatching network-vif-plugged-5d2f10d8-5a8e-48f9-970a-c361a18d8f3a {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 901.285232] env[68571]: WARNING nova.compute.manager [req-fe4d190e-d34c-49ef-aa17-0bb72e642a53 req-c67a9d85-76cf-440f-9245-fd4162ef0ee0 service nova] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Received unexpected event network-vif-plugged-5d2f10d8-5a8e-48f9-970a-c361a18d8f3a for instance with vm_state building and task_state spawning. 
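The Acquiring / acquired / released lines that thread through this section come from oslo_concurrency's named locks, which serialize critical sections such as the resource-tracker claims, the per-instance event queue popped above, and the image-cache paths, and which log the waited/held durations seen here. A small sketch of the same pattern, with illustrative function names rather than nova's actual ones (the image-cache locks in this log additionally take an external semaphore):

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def instance_claim():
        # Runs with the named lock held; lockutils emits the
        # 'acquired ... waited N.NNNs' and '"released" ... held N.NNNs'
        # DEBUG lines visible throughout this log.
        pass

    def cache_image(ds_path):
        # Context-manager form, comparable to the per-VMDK path locks
        # such as "[datastore1] devstack-image-cache_base/...vmdk".
        with lockutils.lock(ds_path):
            pass  # fetch or copy the image while the lock is held

Because the locks are keyed by name, two requests racing on the same image (as req-cfec1e23 and req-93786b12 do on 6e7bf233-3ffe-4b3b-a510-62353d0292a6) queue behind one another instead of both writing the cache entry.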
[ 901.285383] env[68571]: DEBUG nova.compute.manager [req-fe4d190e-d34c-49ef-aa17-0bb72e642a53 req-c67a9d85-76cf-440f-9245-fd4162ef0ee0 service nova] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Received event network-changed-5d2f10d8-5a8e-48f9-970a-c361a18d8f3a {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 901.285532] env[68571]: DEBUG nova.compute.manager [req-fe4d190e-d34c-49ef-aa17-0bb72e642a53 req-c67a9d85-76cf-440f-9245-fd4162ef0ee0 service nova] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Refreshing instance network info cache due to event network-changed-5d2f10d8-5a8e-48f9-970a-c361a18d8f3a. {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 901.285712] env[68571]: DEBUG oslo_concurrency.lockutils [req-fe4d190e-d34c-49ef-aa17-0bb72e642a53 req-c67a9d85-76cf-440f-9245-fd4162ef0ee0 service nova] Acquiring lock "refresh_cache-3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 901.285845] env[68571]: DEBUG oslo_concurrency.lockutils [req-fe4d190e-d34c-49ef-aa17-0bb72e642a53 req-c67a9d85-76cf-440f-9245-fd4162ef0ee0 service nova] Acquired lock "refresh_cache-3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 901.285998] env[68571]: DEBUG nova.network.neutron [req-fe4d190e-d34c-49ef-aa17-0bb72e642a53 req-c67a9d85-76cf-440f-9245-fd4162ef0ee0 service nova] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Refreshing network info cache for port 5d2f10d8-5a8e-48f9-970a-c361a18d8f3a {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 901.474484] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467651, 'name': CreateVM_Task, 'duration_secs': 0.290404} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 901.474657] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 901.475317] env[68571]: DEBUG oslo_concurrency.lockutils [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 901.475479] env[68571]: DEBUG oslo_concurrency.lockutils [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 901.475784] env[68571]: DEBUG oslo_concurrency.lockutils [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 901.476043] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4197107a-e94e-46cc-bda4-7c714a8b67ca {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 901.480385] env[68571]: DEBUG oslo_vmware.api [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Waiting for the task: (returnval){ [ 901.480385] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]525c7938-0324-9dd6-d97a-3356ad1ddea4" [ 901.480385] env[68571]: _type = "Task" [ 901.480385] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 901.487733] env[68571]: DEBUG oslo_vmware.api [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]525c7938-0324-9dd6-d97a-3356ad1ddea4, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 901.534715] env[68571]: DEBUG nova.network.neutron [req-fe4d190e-d34c-49ef-aa17-0bb72e642a53 req-c67a9d85-76cf-440f-9245-fd4162ef0ee0 service nova] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Updated VIF entry in instance network info cache for port 5d2f10d8-5a8e-48f9-970a-c361a18d8f3a. 
{{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 901.535080] env[68571]: DEBUG nova.network.neutron [req-fe4d190e-d34c-49ef-aa17-0bb72e642a53 req-c67a9d85-76cf-440f-9245-fd4162ef0ee0 service nova] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Updating instance_info_cache with network_info: [{"id": "5d2f10d8-5a8e-48f9-970a-c361a18d8f3a", "address": "fa:16:3e:2e:64:ba", "network": {"id": "20ed8763-0c02-410b-9f5d-cb667bfdaa58", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1635789088-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "506bd7cf3d9c4c54aabe7ef0be376fe9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "50886eea-591a-452c-a27b-5f22cfc9df85", "external-id": "nsx-vlan-transportzone-578", "segmentation_id": 578, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap5d2f10d8-5a", "ovs_interfaceid": "5d2f10d8-5a8e-48f9-970a-c361a18d8f3a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 901.544075] env[68571]: DEBUG oslo_concurrency.lockutils [req-fe4d190e-d34c-49ef-aa17-0bb72e642a53 req-c67a9d85-76cf-440f-9245-fd4162ef0ee0 service nova] Releasing lock "refresh_cache-3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 901.992678] env[68571]: DEBUG oslo_concurrency.lockutils [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 901.992678] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 901.992678] env[68571]: DEBUG oslo_concurrency.lockutils [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 913.360484] env[68571]: DEBUG oslo_concurrency.lockutils [None req-0306e316-ef0f-4c6f-b54f-d2915d9193ef tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquiring lock "3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68571) 
inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 932.490751] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 933.490186] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 934.490128] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 934.502043] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 934.502547] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 934.502547] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 934.502670] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 934.504095] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c45dc12b-2fa7-4444-9d48-f6efcad69bf3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 934.512751] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed0a22fc-121d-4152-b5fa-4ed14303f934 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 934.526861] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14f30898-5c46-4f67-bc75-23d70c2ffb41 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 934.533146] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b7cd1032-952c-433a-9a90-0628ad695be2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 934.563026] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] 
Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180915MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 934.563197] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 934.563390] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 934.635934] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 15eb6744-4b26-4d7a-8639-cb3bd13e3726 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 934.636109] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 934.636247] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 934.636364] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 934.636482] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 934.636599] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3adaf481-5844-45ac-8dc9-eb396a47ed1c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 934.636713] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance c962c9c7-04a4-46ec-a46f-fac13caa6a1e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 934.636828] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 25f17a16-f752-4927-a2a5-73f1f18e5c8c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 934.636940] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 244ba708-279e-440e-bc18-8c6ee7b83250 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 934.637063] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 934.647512] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 394b41bd-e7f7-4a77-87d1-6777e0991d50 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 934.657977] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance b60eb700-434f-4bea-a84f-9071402001c3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 934.667714] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3986e039-9ed6-46e4-82b0-d3079bc45624 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 934.678523] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 7f9587d5-7089-4e51-961e-88e83c573cb3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 934.688031] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 249cf445-30fa-4de2-b09d-b8210eb3effa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 934.697618] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 5e571ae2-9d45-402d-bce5-6e3721cc5374 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 934.707688] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 18849294-d11e-40ed-9c2a-7706f7409d9a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 934.717299] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 1df720d6-655c-49b6-a65d-d56b757143a7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 934.726708] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 2a80d267-c2f0-4745-b23b-24717e4d9531 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 934.736226] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 466616ca-0cad-4561-b0d6-1e34e3243418 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 934.745306] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4bc7288c-3483-46ed-9c4f-673f86b10446 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. 
Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 934.756122] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance d910cc12-8da2-4ce6-9107-a54c870405de has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 934.765225] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance d5416006-57fd-4966-90d2-3ba18d3eceba has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 934.775579] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 812bbf03-e2c0-4827-8ed9-cc60611a77ca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 934.785940] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4c70fd0e-9872-423e-8b7e-4c17760d88bc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 934.796246] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 934.796488] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 934.796635] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 935.093929] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44989fbb-ba98-4cc1-ae45-5874d8319fc3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 935.102409] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8cceba7b-6129-4ebe-a4bf-6e7f011fa0f4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 935.130831] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1ea0a9f-f3b5-4f5a-9279-3d6581be0b68 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 935.137424] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a1dbf5c-19d6-4286-9dd9-fe29fc63fe88 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 935.149938] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 935.159128] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 935.172070] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 935.172272] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.609s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 936.173011] env[68571]: DEBUG oslo_service.periodic_task [None 
req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 936.484722] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 936.489286] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 938.488761] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 938.489122] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 938.489122] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 938.512839] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 938.513166] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 938.513356] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 938.513532] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 938.513702] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 938.513865] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Skipping network cache update for instance because it is Building. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 938.514035] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 938.514198] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 938.514358] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 938.514513] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 938.514675] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 938.515228] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 938.515473] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 938.515795] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 943.870751] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Acquiring lock "b90ac11a-50c6-4d12-a545-ccd92243e6ca" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 943.871067] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Lock "b90ac11a-50c6-4d12-a545-ccd92243e6ca" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 946.484024] env[68571]: WARNING oslo_vmware.rw_handles [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 946.484024] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 946.484024] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 946.484024] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 946.484024] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 946.484024] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 946.484024] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 946.484024] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 946.484024] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 946.484024] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 946.484024] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 946.484024] env[68571]: ERROR oslo_vmware.rw_handles [ 946.484724] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/f7a95201-4233-48eb-90fe-627ec83eb749/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 946.486187] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 946.486421] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-cfec1e23-eefa-4e62-b178-19c6f9697099
tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Copying Virtual Disk [datastore1] vmware_temp/f7a95201-4233-48eb-90fe-627ec83eb749/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/f7a95201-4233-48eb-90fe-627ec83eb749/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 946.486713] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5091bceb-731c-4b8b-93d8-abf1a1011f10 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 946.494417] env[68571]: DEBUG oslo_vmware.api [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Waiting for the task: (returnval){ [ 946.494417] env[68571]: value = "task-3467652" [ 946.494417] env[68571]: _type = "Task" [ 946.494417] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 946.504352] env[68571]: DEBUG oslo_vmware.api [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Task: {'id': task-3467652, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 947.006028] env[68571]: DEBUG oslo_vmware.exceptions [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Fault InvalidArgument not matched. 
{{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 947.006028] env[68571]: DEBUG oslo_concurrency.lockutils [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 947.006028] env[68571]: ERROR nova.compute.manager [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 947.006028] env[68571]: Faults: ['InvalidArgument'] [ 947.006028] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Traceback (most recent call last): [ 947.006028] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 947.006028] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] yield resources [ 947.006028] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 947.006028] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] self.driver.spawn(context, instance, image_meta, [ 947.006348] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 947.006348] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] self._vmops.spawn(context, instance, image_meta, injected_files, [ 947.006348] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 947.006348] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] self._fetch_image_if_missing(context, vi) [ 947.006348] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 947.006348] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] image_cache(vi, tmp_image_ds_loc) [ 947.006348] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 947.006348] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] vm_util.copy_virtual_disk( [ 947.006348] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 947.006348] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] session._wait_for_task(vmdk_copy_task) [ 947.006348] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in 
_wait_for_task [ 947.006348] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] return self.wait_for_task(task_ref) [ 947.006348] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 947.006664] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] return evt.wait() [ 947.006664] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 947.006664] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] result = hub.switch() [ 947.006664] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 947.006664] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] return self.greenlet.switch() [ 947.006664] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 947.006664] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] self.f(*self.args, **self.kw) [ 947.006664] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 947.006664] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] raise exceptions.translate_fault(task_info.error) [ 947.006664] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 947.006664] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Faults: ['InvalidArgument'] [ 947.006664] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] [ 947.006969] env[68571]: INFO nova.compute.manager [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Terminating instance [ 947.007420] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 947.007420] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 947.007564] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9c0ab5f7-5db5-4baa-b0e6-9ab0af400ff3 {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.009790] env[68571]: DEBUG nova.compute.manager [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 947.009979] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 947.010763] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-515e5ffa-722f-4fb8-89a9-7fea0f74e760 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.017567] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 947.017771] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-739532f5-6879-46d7-afb0-42859f33619f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.019951] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 947.020143] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 947.021098] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c4be7924-6a09-4e4b-8d1e-e88b6dd757cb {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.025591] env[68571]: DEBUG oslo_vmware.api [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Waiting for the task: (returnval){ [ 947.025591] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]5222e1bb-d695-b793-75df-868be7bf3a34" [ 947.025591] env[68571]: _type = "Task" [ 947.025591] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 947.032872] env[68571]: DEBUG oslo_vmware.api [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]5222e1bb-d695-b793-75df-868be7bf3a34, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 947.084917] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 947.085160] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 947.085341] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Deleting the datastore file [datastore1] 15eb6744-4b26-4d7a-8639-cb3bd13e3726 {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 947.085608] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1071b3a1-44da-4d7c-a7d5-22a99f6c63f1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.094656] env[68571]: DEBUG oslo_vmware.api [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Waiting for the task: (returnval){ [ 947.094656] env[68571]: value = "task-3467654" [ 947.094656] env[68571]: _type = "Task" [ 947.094656] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 947.102409] env[68571]: DEBUG oslo_vmware.api [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Task: {'id': task-3467654, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 947.536454] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 947.536780] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Creating directory with path [datastore1] vmware_temp/af0967db-10c8-4d5d-ad1e-8078c8bbafeb/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 947.536867] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-195efaed-916f-4d29-8102-373b8d6f1f63 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.548218] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Created directory with path [datastore1] vmware_temp/af0967db-10c8-4d5d-ad1e-8078c8bbafeb/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 947.548416] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Fetch image to [datastore1] vmware_temp/af0967db-10c8-4d5d-ad1e-8078c8bbafeb/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 947.548583] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/af0967db-10c8-4d5d-ad1e-8078c8bbafeb/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 947.549320] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-877bf9db-5b94-4882-915a-f20b498de3d4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.555711] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da95d682-e0b1-4396-a647-8161e3a61297 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.564504] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f7f6680-13e3-40d4-89f4-ca0484e218e2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.594939] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-640a204e-d68a-40cb-9246-acece9cce388 
{{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.606775] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-ae7ebf25-05b7-43a8-af3f-861a4186a1f3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 947.608432] env[68571]: DEBUG oslo_vmware.api [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Task: {'id': task-3467654, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.079222} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 947.608665] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 947.608845] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 947.609012] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 947.609195] env[68571]: INFO nova.compute.manager [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Took 0.60 seconds to destroy the instance on the hypervisor. 
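The failed-spawn sequence above is the generic oslo.vmware task pattern: the driver invokes CopyVirtualDisk_Task against vCenter's virtualDiskManager, wait_for_task() polls the task (the repeated "progress is 0%" lines), and a failed task is re-raised as a translated fault, falling back to a bare VimFaultException when get_fault_class() finds no dedicated class (the "Fault InvalidArgument not matched" line). Below is a minimal sketch of that pattern, not nova's actual code; it assumes a reachable vCenter, and the hostname, credentials, and datastore paths are placeholders rather than values from this log:

```python
from oslo_vmware import api
from oslo_vmware import exceptions as vexc
from oslo_vmware import vim_util

# Placeholder endpoint and credentials (not from this deployment).
session = api.VMwareAPISession('vc.example.test', 'stack', 'secret',
                               api_retry_count=2, task_poll_interval=0.5)

# One way to resolve a Datacenter managed-object ref via the
# PropertyCollector, following the oslo.vmware invoke_api convention.
dc_ref = session.invoke_api(
    vim_util, 'get_objects', session.vim, 'Datacenter', 1).objects[0].obj

disk_mgr = session.vim.service_content.virtualDiskManager
task = session.invoke_api(
    session.vim, 'CopyVirtualDisk_Task', disk_mgr,
    sourceName='[datastore1] vmware_temp/example/tmp-sparse.vmdk',
    sourceDatacenter=dc_ref,
    destName='[datastore1] devstack-image-cache_base/example.vmdk')
try:
    # wait_for_task() drives _poll_task() on a looping call (the
    # "progress is 0%" DEBUG lines) until the task succeeds or errors.
    session.wait_for_task(task)
except vexc.VimFaultException as exc:
    # Faults with no dedicated exception class are re-raised as
    # VimFaultException; the vSphere fault names travel in fault_list
    # (here that would be ['InvalidArgument']).
    print(exc.fault_list, exc)
```

wait_for_task() re-reads the task's info property until its state reaches success or error, which is why each poll interval appears in the log as its own DEBUG line.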
[ 947.611272] env[68571]: DEBUG nova.compute.claims [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 947.611438] env[68571]: DEBUG oslo_concurrency.lockutils [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 947.611647] env[68571]: DEBUG oslo_concurrency.lockutils [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 947.628605] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 947.679872] env[68571]: DEBUG oslo_vmware.rw_handles [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/af0967db-10c8-4d5d-ad1e-8078c8bbafeb/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 947.737803] env[68571]: DEBUG oslo_vmware.rw_handles [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 947.737972] env[68571]: DEBUG oslo_vmware.rw_handles [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/af0967db-10c8-4d5d-ad1e-8078c8bbafeb/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 948.009196] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d349adc1-2435-4b9c-8adf-a8d35f188ff8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 948.017417] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-577508db-889d-4b23-8646-62b6dc68e019 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 948.046212] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-68fec592-d97a-47d6-a3cf-118831b85f92 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 948.054329] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-87f61e85-9bc7-4fb8-b9a9-7f4dae5eb547 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 948.066401] env[68571]: DEBUG nova.compute.provider_tree [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 948.075095] env[68571]: DEBUG nova.scheduler.client.report [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 948.093257] env[68571]: DEBUG oslo_concurrency.lockutils [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.481s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 948.093808] env[68571]: ERROR nova.compute.manager [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 948.093808] env[68571]: Faults: ['InvalidArgument'] [ 948.093808] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Traceback (most recent call last): [ 948.093808] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 948.093808] env[68571]: ERROR nova.compute.manager 
[instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] self.driver.spawn(context, instance, image_meta, [ 948.093808] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 948.093808] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] self._vmops.spawn(context, instance, image_meta, injected_files, [ 948.093808] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 948.093808] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] self._fetch_image_if_missing(context, vi) [ 948.093808] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 948.093808] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] image_cache(vi, tmp_image_ds_loc) [ 948.093808] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 948.094103] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] vm_util.copy_virtual_disk( [ 948.094103] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 948.094103] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] session._wait_for_task(vmdk_copy_task) [ 948.094103] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 948.094103] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] return self.wait_for_task(task_ref) [ 948.094103] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 948.094103] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] return evt.wait() [ 948.094103] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 948.094103] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] result = hub.switch() [ 948.094103] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 948.094103] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] return self.greenlet.switch() [ 948.094103] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 948.094103] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] self.f(*self.args, **self.kw) [ 948.094370] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 948.094370] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] raise exceptions.translate_fault(task_info.error) [ 948.094370] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 948.094370] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Faults: ['InvalidArgument'] [ 948.094370] env[68571]: ERROR nova.compute.manager [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] [ 948.094634] env[68571]: DEBUG nova.compute.utils [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 948.095958] env[68571]: DEBUG nova.compute.manager [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Build of instance 15eb6744-4b26-4d7a-8639-cb3bd13e3726 was re-scheduled: A specified parameter was not correct: fileType [ 948.095958] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 948.096854] env[68571]: DEBUG nova.compute.manager [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 948.097068] env[68571]: DEBUG nova.compute.manager [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 948.097253] env[68571]: DEBUG nova.compute.manager [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 948.097419] env[68571]: DEBUG nova.network.neutron [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 948.612471] env[68571]: DEBUG nova.network.neutron [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 948.622604] env[68571]: INFO nova.compute.manager [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Took 0.52 seconds to deallocate network for instance. [ 948.714110] env[68571]: INFO nova.scheduler.client.report [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Deleted allocations for instance 15eb6744-4b26-4d7a-8639-cb3bd13e3726 [ 948.734522] env[68571]: DEBUG oslo_concurrency.lockutils [None req-cfec1e23-eefa-4e62-b178-19c6f9697099 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Lock "15eb6744-4b26-4d7a-8639-cb3bd13e3726" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 387.431s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 948.735647] env[68571]: DEBUG oslo_concurrency.lockutils [None req-71acfe76-475a-43ff-ac2f-cf1535d83ff9 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Lock "15eb6744-4b26-4d7a-8639-cb3bd13e3726" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 188.262s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 948.735857] env[68571]: DEBUG oslo_concurrency.lockutils [None req-71acfe76-475a-43ff-ac2f-cf1535d83ff9 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Acquiring lock "15eb6744-4b26-4d7a-8639-cb3bd13e3726-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 948.736073] env[68571]: DEBUG oslo_concurrency.lockutils [None req-71acfe76-475a-43ff-ac2f-cf1535d83ff9 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Lock "15eb6744-4b26-4d7a-8639-cb3bd13e3726-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68571) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 948.736240] env[68571]: DEBUG oslo_concurrency.lockutils [None req-71acfe76-475a-43ff-ac2f-cf1535d83ff9 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Lock "15eb6744-4b26-4d7a-8639-cb3bd13e3726-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 948.738192] env[68571]: INFO nova.compute.manager [None req-71acfe76-475a-43ff-ac2f-cf1535d83ff9 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Terminating instance [ 948.739830] env[68571]: DEBUG nova.compute.manager [None req-71acfe76-475a-43ff-ac2f-cf1535d83ff9 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 948.739972] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-71acfe76-475a-43ff-ac2f-cf1535d83ff9 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 948.740762] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-7725dc2c-fcbc-49b1-8da3-76100ee57aaa {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 948.749429] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56a6513d-9b31-4a78-9d35-58576ba5d0ec {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 948.760862] env[68571]: DEBUG nova.compute.manager [None req-fffdb21b-53e3-451f-b39c-232bf6564e63 tempest-ServerTagsTestJSON-2108165716 tempest-ServerTagsTestJSON-2108165716-project-member] [instance: 394b41bd-e7f7-4a77-87d1-6777e0991d50] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 948.780423] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-71acfe76-475a-43ff-ac2f-cf1535d83ff9 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 15eb6744-4b26-4d7a-8639-cb3bd13e3726 could not be found. [ 948.780628] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-71acfe76-475a-43ff-ac2f-cf1535d83ff9 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 948.780889] env[68571]: INFO nova.compute.manager [None req-71acfe76-475a-43ff-ac2f-cf1535d83ff9 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Took 0.04 seconds to destroy the instance on the hypervisor. 
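The lock records above ("held 387.431s" for the build, "waited 188.262s" for the terminate) show the serialization mechanism at work: build and terminate for an instance contend on a single oslo.concurrency named lock keyed by the instance UUID. The sketch below mimics that pattern with lockutils.synchronized; it is illustrative only, not Nova's actual code, and the sleeps and thread scaffolding are invented stand-ins for the real build/destroy work.

# Illustrative sketch only -- not Nova's implementation. Two operations on
# the same instance serialize on a lock named after the instance UUID;
# oslo.concurrency logs how long each caller waited for and held the lock,
# which is where the "waited N.NNNs" / "held N.NNNs" figures above come from.
import threading
import time

from oslo_concurrency import lockutils  # pip install oslo.concurrency

INSTANCE_UUID = "15eb6744-4b26-4d7a-8639-cb3bd13e3726"  # from the log above


def build_and_run_instance(instance_uuid):
    # The lock name is only known at call time, so the locked function is
    # defined and decorated inline -- which is why the records name
    # "build_and_run_instance.._locked_do_build_and_run_instance".
    @lockutils.synchronized(instance_uuid)
    def _locked_do_build_and_run_instance():
        time.sleep(1.0)  # stand-in for the long-running build

    _locked_do_build_and_run_instance()


def terminate_instance(instance_uuid):
    @lockutils.synchronized(instance_uuid)
    def do_terminate_instance():
        pass  # stand-in for destroy/cleanup; runs only once the build is done

    do_terminate_instance()


builder = threading.Thread(target=build_and_run_instance, args=(INSTANCE_UUID,))
builder.start()
time.sleep(0.1)                    # let the build acquire the lock first
terminate_instance(INSTANCE_UUID)  # blocks here until the build releases it
builder.join()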
[ 948.781151] env[68571]: DEBUG oslo.service.loopingcall [None req-71acfe76-475a-43ff-ac2f-cf1535d83ff9 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 948.781489] env[68571]: DEBUG nova.compute.manager [-] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 948.781594] env[68571]: DEBUG nova.network.neutron [-] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 948.785713] env[68571]: DEBUG nova.compute.manager [None req-fffdb21b-53e3-451f-b39c-232bf6564e63 tempest-ServerTagsTestJSON-2108165716 tempest-ServerTagsTestJSON-2108165716-project-member] [instance: 394b41bd-e7f7-4a77-87d1-6777e0991d50] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 948.805801] env[68571]: DEBUG nova.network.neutron [-] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 948.818163] env[68571]: DEBUG oslo_concurrency.lockutils [None req-fffdb21b-53e3-451f-b39c-232bf6564e63 tempest-ServerTagsTestJSON-2108165716 tempest-ServerTagsTestJSON-2108165716-project-member] Lock "394b41bd-e7f7-4a77-87d1-6777e0991d50" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 208.353s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 948.823679] env[68571]: INFO nova.compute.manager [-] [instance: 15eb6744-4b26-4d7a-8639-cb3bd13e3726] Took 0.04 seconds to deallocate network for instance. [ 948.830850] env[68571]: DEBUG nova.compute.manager [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Starting instance... 
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 948.877332] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 948.877579] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 948.879149] env[68571]: INFO nova.compute.claims [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 948.914294] env[68571]: DEBUG oslo_concurrency.lockutils [None req-71acfe76-475a-43ff-ac2f-cf1535d83ff9 tempest-ServersAdminTestJSON-1356374649 tempest-ServersAdminTestJSON-1356374649-project-member] Lock "15eb6744-4b26-4d7a-8639-cb3bd13e3726" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.179s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 949.252485] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-929da407-8263-45d5-aa22-16d0a95d6e6c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 949.260371] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96ea5a4b-0810-4bc2-a4c7-311de61329fa {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 949.292708] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d9c3fe4c-de14-4b72-966a-ab73e3ec671a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 949.299636] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fcc073ee-38ed-4fd2-8ece-b1c3b8afe328 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 949.315028] env[68571]: DEBUG nova.compute.provider_tree [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 949.323856] env[68571]: DEBUG nova.scheduler.client.report [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 
'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 949.337408] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.460s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 949.337864] env[68571]: DEBUG nova.compute.manager [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 949.374790] env[68571]: DEBUG nova.compute.utils [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 949.379018] env[68571]: DEBUG nova.compute.manager [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 949.379018] env[68571]: DEBUG nova.network.neutron [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 949.386123] env[68571]: DEBUG nova.compute.manager [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 949.451223] env[68571]: DEBUG nova.compute.manager [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 949.462658] env[68571]: DEBUG nova.policy [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f340966281a848b783b75a5c89986e6f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '077f1a9875da491ab41f825a6faab831', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 949.478311] env[68571]: DEBUG nova.virt.hardware [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:29:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='fb2a59dd-74cd-4d85-a452-9b5582f6808c',id=37,is_public=True,memory_mb=128,name='tempest-test_resize_flavor_-1652956527',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 949.478779] env[68571]: DEBUG nova.virt.hardware [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 949.478779] env[68571]: DEBUG nova.virt.hardware [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 949.478932] env[68571]: DEBUG nova.virt.hardware [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 949.479065] env[68571]: DEBUG nova.virt.hardware [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 949.481013] env[68571]: DEBUG nova.virt.hardware [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 949.481013] env[68571]: 
DEBUG nova.virt.hardware [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 949.481013] env[68571]: DEBUG nova.virt.hardware [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 949.481013] env[68571]: DEBUG nova.virt.hardware [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 949.481013] env[68571]: DEBUG nova.virt.hardware [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 949.481213] env[68571]: DEBUG nova.virt.hardware [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 949.481213] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56335359-b57b-45a2-829f-6f15dcd9d4fb {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 949.489015] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9660fecd-fa64-472f-9bc8-3964c868b56a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 949.764525] env[68571]: DEBUG nova.network.neutron [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Successfully created port: b8d64d34-b317-489b-91e2-55b8239349e1 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 950.355475] env[68571]: DEBUG nova.network.neutron [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Successfully updated port: b8d64d34-b317-489b-91e2-55b8239349e1 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 950.376460] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Acquiring lock "refresh_cache-b60eb700-434f-4bea-a84f-9071402001c3" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 950.376460] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 
tempest-MigrationsAdminTest-1386299509-project-member] Acquired lock "refresh_cache-b60eb700-434f-4bea-a84f-9071402001c3" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 950.376460] env[68571]: DEBUG nova.network.neutron [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 950.415305] env[68571]: DEBUG nova.network.neutron [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 950.537129] env[68571]: DEBUG oslo_concurrency.lockutils [None req-47ac41bb-2e62-497d-bb75-a808d4f4ddf7 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Acquiring lock "b60eb700-434f-4bea-a84f-9071402001c3" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 950.689039] env[68571]: DEBUG nova.network.neutron [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Updating instance_info_cache with network_info: [{"id": "b8d64d34-b317-489b-91e2-55b8239349e1", "address": "fa:16:3e:7c:e7:f1", "network": {"id": "802e91c0-b497-4996-a9a8-0fb2969a1fd5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "129da41d4b1a4202be57f86562f628cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb8d64d34-b3", "ovs_interfaceid": "b8d64d34-b317-489b-91e2-55b8239349e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 950.703396] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Releasing lock "refresh_cache-b60eb700-434f-4bea-a84f-9071402001c3" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 950.704434] env[68571]: DEBUG nova.compute.manager [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Instance 
network_info: |[{"id": "b8d64d34-b317-489b-91e2-55b8239349e1", "address": "fa:16:3e:7c:e7:f1", "network": {"id": "802e91c0-b497-4996-a9a8-0fb2969a1fd5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "129da41d4b1a4202be57f86562f628cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb8d64d34-b3", "ovs_interfaceid": "b8d64d34-b317-489b-91e2-55b8239349e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 950.705480] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7c:e7:f1', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f3d7e184-c87f-47a5-8d0d-9fa20e07e669', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b8d64d34-b317-489b-91e2-55b8239349e1', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 950.713413] env[68571]: DEBUG oslo.service.loopingcall [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 950.713859] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 950.714119] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-9ac50677-7f36-4ba7-8e55-ff82d2b66756 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 950.734721] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 950.734721] env[68571]: value = "task-3467655" [ 950.734721] env[68571]: _type = "Task" [ 950.734721] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 950.743810] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467655, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 951.047618] env[68571]: DEBUG nova.compute.manager [req-c1eb4003-9d52-4b3a-8261-fd7beedb1ddc req-6d4fb8bd-e1db-4a3a-817e-7b6934b2fea4 service nova] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Received event network-vif-plugged-b8d64d34-b317-489b-91e2-55b8239349e1 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 951.047618] env[68571]: DEBUG oslo_concurrency.lockutils [req-c1eb4003-9d52-4b3a-8261-fd7beedb1ddc req-6d4fb8bd-e1db-4a3a-817e-7b6934b2fea4 service nova] Acquiring lock "b60eb700-434f-4bea-a84f-9071402001c3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 951.047618] env[68571]: DEBUG oslo_concurrency.lockutils [req-c1eb4003-9d52-4b3a-8261-fd7beedb1ddc req-6d4fb8bd-e1db-4a3a-817e-7b6934b2fea4 service nova] Lock "b60eb700-434f-4bea-a84f-9071402001c3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 951.047618] env[68571]: DEBUG oslo_concurrency.lockutils [req-c1eb4003-9d52-4b3a-8261-fd7beedb1ddc req-6d4fb8bd-e1db-4a3a-817e-7b6934b2fea4 service nova] Lock "b60eb700-434f-4bea-a84f-9071402001c3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 951.048230] env[68571]: DEBUG nova.compute.manager [req-c1eb4003-9d52-4b3a-8261-fd7beedb1ddc req-6d4fb8bd-e1db-4a3a-817e-7b6934b2fea4 service nova] [instance: b60eb700-434f-4bea-a84f-9071402001c3] No waiting events found dispatching network-vif-plugged-b8d64d34-b317-489b-91e2-55b8239349e1 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 951.048230] env[68571]: WARNING nova.compute.manager [req-c1eb4003-9d52-4b3a-8261-fd7beedb1ddc req-6d4fb8bd-e1db-4a3a-817e-7b6934b2fea4 service nova] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Received unexpected event network-vif-plugged-b8d64d34-b317-489b-91e2-55b8239349e1 for instance with vm_state building and task_state deleting. [ 951.048230] env[68571]: DEBUG nova.compute.manager [req-c1eb4003-9d52-4b3a-8261-fd7beedb1ddc req-6d4fb8bd-e1db-4a3a-817e-7b6934b2fea4 service nova] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Received event network-changed-b8d64d34-b317-489b-91e2-55b8239349e1 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 951.048230] env[68571]: DEBUG nova.compute.manager [req-c1eb4003-9d52-4b3a-8261-fd7beedb1ddc req-6d4fb8bd-e1db-4a3a-817e-7b6934b2fea4 service nova] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Refreshing instance network info cache due to event network-changed-b8d64d34-b317-489b-91e2-55b8239349e1. 
{{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 951.048230] env[68571]: DEBUG oslo_concurrency.lockutils [req-c1eb4003-9d52-4b3a-8261-fd7beedb1ddc req-6d4fb8bd-e1db-4a3a-817e-7b6934b2fea4 service nova] Acquiring lock "refresh_cache-b60eb700-434f-4bea-a84f-9071402001c3" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 951.048351] env[68571]: DEBUG oslo_concurrency.lockutils [req-c1eb4003-9d52-4b3a-8261-fd7beedb1ddc req-6d4fb8bd-e1db-4a3a-817e-7b6934b2fea4 service nova] Acquired lock "refresh_cache-b60eb700-434f-4bea-a84f-9071402001c3" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 951.048351] env[68571]: DEBUG nova.network.neutron [req-c1eb4003-9d52-4b3a-8261-fd7beedb1ddc req-6d4fb8bd-e1db-4a3a-817e-7b6934b2fea4 service nova] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Refreshing network info cache for port b8d64d34-b317-489b-91e2-55b8239349e1 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 951.246335] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467655, 'name': CreateVM_Task, 'duration_secs': 0.28432} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 951.246513] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 951.247370] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 951.247599] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 951.247974] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 951.248297] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-98684e49-3c08-4b6c-877b-f67f0c67705d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 951.253094] env[68571]: DEBUG oslo_vmware.api [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Waiting for the task: (returnval){ [ 951.253094] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52e80596-27a1-de97-e3db-24e9dfdab798" [ 951.253094] env[68571]: _type = "Task" [ 951.253094] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 951.260906] env[68571]: DEBUG oslo_vmware.api [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52e80596-27a1-de97-e3db-24e9dfdab798, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 951.699093] env[68571]: DEBUG nova.network.neutron [req-c1eb4003-9d52-4b3a-8261-fd7beedb1ddc req-6d4fb8bd-e1db-4a3a-817e-7b6934b2fea4 service nova] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Updated VIF entry in instance network info cache for port b8d64d34-b317-489b-91e2-55b8239349e1. {{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 951.699442] env[68571]: DEBUG nova.network.neutron [req-c1eb4003-9d52-4b3a-8261-fd7beedb1ddc req-6d4fb8bd-e1db-4a3a-817e-7b6934b2fea4 service nova] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Updating instance_info_cache with network_info: [{"id": "b8d64d34-b317-489b-91e2-55b8239349e1", "address": "fa:16:3e:7c:e7:f1", "network": {"id": "802e91c0-b497-4996-a9a8-0fb2969a1fd5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "129da41d4b1a4202be57f86562f628cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb8d64d34-b3", "ovs_interfaceid": "b8d64d34-b317-489b-91e2-55b8239349e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 951.713600] env[68571]: DEBUG oslo_concurrency.lockutils [req-c1eb4003-9d52-4b3a-8261-fd7beedb1ddc req-6d4fb8bd-e1db-4a3a-817e-7b6934b2fea4 service nova] Releasing lock "refresh_cache-b60eb700-434f-4bea-a84f-9071402001c3" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 951.764103] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 951.764359] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 951.764568] env[68571]: DEBUG 
oslo_concurrency.lockutils [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 963.472259] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "afe033a3-4e04-4249-beed-169a3e40a721" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 963.472510] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "afe033a3-4e04-4249-beed-169a3e40a721" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 965.375503] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b1ee3429-661c-409b-8e03-a2631602de55 tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Acquiring lock "14dee505-e30a-4395-9fe3-fb505492c4df" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 965.375824] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b1ee3429-661c-409b-8e03-a2631602de55 tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Lock "14dee505-e30a-4395-9fe3-fb505492c4df" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 973.081976] env[68571]: DEBUG oslo_concurrency.lockutils [None req-36bb7c0f-fa85-4230-86c5-862f959b0fc2 tempest-AttachVolumeShelveTestJSON-1274929045 tempest-AttachVolumeShelveTestJSON-1274929045-project-member] Acquiring lock "73f10282-d15a-4d6b-a0b9-5b3cb8764ff9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 973.082504] env[68571]: DEBUG oslo_concurrency.lockutils [None req-36bb7c0f-fa85-4230-86c5-862f959b0fc2 tempest-AttachVolumeShelveTestJSON-1274929045 tempest-AttachVolumeShelveTestJSON-1274929045-project-member] Lock "73f10282-d15a-4d6b-a0b9-5b3cb8764ff9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 973.984999] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b3a791b5-ff4e-43d4-809f-aed71cd29977 tempest-ServerMetadataNegativeTestJSON-5225460 tempest-ServerMetadataNegativeTestJSON-5225460-project-member] Acquiring lock "e9b8ab85-e972-4081-ae38-602a92fe3ab9" by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 973.985246] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b3a791b5-ff4e-43d4-809f-aed71cd29977 tempest-ServerMetadataNegativeTestJSON-5225460 tempest-ServerMetadataNegativeTestJSON-5225460-project-member] Lock "e9b8ab85-e972-4081-ae38-602a92fe3ab9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 981.588680] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1af6047b-ff48-44e5-93a8-da896cca9e6b tempest-ImagesTestJSON-1315536367 tempest-ImagesTestJSON-1315536367-project-member] Acquiring lock "061ea5d6-5470-4d7d-9ab1-ae5e606dd9cd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 981.588958] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1af6047b-ff48-44e5-93a8-da896cca9e6b tempest-ImagesTestJSON-1315536367 tempest-ImagesTestJSON-1315536367-project-member] Lock "061ea5d6-5470-4d7d-9ab1-ae5e606dd9cd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 982.973296] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e16fb427-6ffc-4a9b-bcb0-b5513bc1c992 tempest-ServersTestMultiNic-1790639670 tempest-ServersTestMultiNic-1790639670-project-member] Acquiring lock "7a137e14-98ec-4718-8ff4-3700d2ef7ee9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 982.973561] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e16fb427-6ffc-4a9b-bcb0-b5513bc1c992 tempest-ServersTestMultiNic-1790639670 tempest-ServersTestMultiNic-1790639670-project-member] Lock "7a137e14-98ec-4718-8ff4-3700d2ef7ee9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 990.204129] env[68571]: DEBUG oslo_concurrency.lockutils [None req-39f896c9-3af0-41d8-980f-b379b470abaf tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Acquiring lock "9cace51b-100c-48d0-813c-eb31ec9384ec" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 990.204440] env[68571]: DEBUG oslo_concurrency.lockutils [None req-39f896c9-3af0-41d8-980f-b379b470abaf tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Lock "9cace51b-100c-48d0-813c-eb31ec9384ec" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 992.312626] env[68571]: DEBUG oslo_concurrency.lockutils [None 
req-8579d8f0-61aa-41e4-a5b5-f9df996c1a62 tempest-VolumesAdminNegativeTest-600178275 tempest-VolumesAdminNegativeTest-600178275-project-member] Acquiring lock "c8be0938-4b38-4e05-8afa-202d87a315b7" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 992.312931] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8579d8f0-61aa-41e4-a5b5-f9df996c1a62 tempest-VolumesAdminNegativeTest-600178275 tempest-VolumesAdminNegativeTest-600178275-project-member] Lock "c8be0938-4b38-4e05-8afa-202d87a315b7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 993.265032] env[68571]: WARNING oslo_vmware.rw_handles [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 993.265032] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 993.265032] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 993.265032] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 993.265032] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 993.265032] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 993.265032] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 993.265032] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 993.265032] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 993.265032] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 993.265032] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 993.265032] env[68571]: ERROR oslo_vmware.rw_handles [ 993.265530] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/af0967db-10c8-4d5d-ad1e-8078c8bbafeb/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 993.267664] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 993.267929] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Copying Virtual Disk [datastore1] 
vmware_temp/af0967db-10c8-4d5d-ad1e-8078c8bbafeb/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/af0967db-10c8-4d5d-ad1e-8078c8bbafeb/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 993.268251] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-83a2eb4d-d998-4a0f-9ae5-7085931f2983 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 993.276512] env[68571]: DEBUG oslo_vmware.api [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Waiting for the task: (returnval){ [ 993.276512] env[68571]: value = "task-3467656" [ 993.276512] env[68571]: _type = "Task" [ 993.276512] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 993.284867] env[68571]: DEBUG oslo_vmware.api [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Task: {'id': task-3467656, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 993.490249] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 993.789039] env[68571]: DEBUG oslo_vmware.exceptions [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Fault InvalidArgument not matched. 
{{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 993.789309] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 993.789959] env[68571]: ERROR nova.compute.manager [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 993.789959] env[68571]: Faults: ['InvalidArgument'] [ 993.789959] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Traceback (most recent call last): [ 993.789959] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 993.789959] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] yield resources [ 993.789959] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 993.789959] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] self.driver.spawn(context, instance, image_meta, [ 993.789959] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 993.789959] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 993.789959] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 993.789959] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] self._fetch_image_if_missing(context, vi) [ 993.789959] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 993.790560] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] image_cache(vi, tmp_image_ds_loc) [ 993.790560] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 993.790560] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] vm_util.copy_virtual_disk( [ 993.790560] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 993.790560] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] session._wait_for_task(vmdk_copy_task) [ 993.790560] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, 
in _wait_for_task [ 993.790560] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] return self.wait_for_task(task_ref) [ 993.790560] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 993.790560] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] return evt.wait() [ 993.790560] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 993.790560] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] result = hub.switch() [ 993.790560] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 993.790560] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] return self.greenlet.switch() [ 993.791199] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 993.791199] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] self.f(*self.args, **self.kw) [ 993.791199] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 993.791199] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] raise exceptions.translate_fault(task_info.error) [ 993.791199] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 993.791199] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Faults: ['InvalidArgument'] [ 993.791199] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] [ 993.791199] env[68571]: INFO nova.compute.manager [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Terminating instance [ 993.791858] env[68571]: DEBUG oslo_concurrency.lockutils [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 993.792067] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 993.792728] env[68571]: DEBUG nova.compute.manager [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: 
e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 993.792906] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 993.793145] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ff6ba067-5aa2-4732-8f2d-16573e67c055 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 993.795455] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-165f24cb-061e-4d2d-a8df-182c43799c6a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 993.802387] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 993.802604] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-6a0c1b77-7b15-4d9f-bee6-f2d4155b336f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 993.804834] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 993.804998] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 993.805933] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-48105cf2-718a-4285-8d99-924cc79f5187 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 993.810662] env[68571]: DEBUG oslo_vmware.api [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Waiting for the task: (returnval){ [ 993.810662] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52728642-c5f9-424b-08f4-25cb7a2743a3" [ 993.810662] env[68571]: _type = "Task" [ 993.810662] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 993.817993] env[68571]: DEBUG oslo_vmware.api [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52728642-c5f9-424b-08f4-25cb7a2743a3, 'name': SearchDatastore_Task} progress is 0%. 
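Note on the directory creation above: the FileManager.MakeDirectory call (confirmed by "Created directory with path [datastore1] devstack-image-cache_base" and "Folder ... created.") is the idempotent first step of the image-cache fetch path. A minimal sketch of the same call through oslo.vmware's public session API; session is an authenticated VMwareAPISession and dc_ref a datacenter managed-object reference, both assumed to exist here:

    from oslo_vmware import exceptions as vexc

    def ensure_cache_dir(session, ds_path, dc_ref):
        # MakeDirectory is a plain (non-task) vSphere call on the FileManager;
        # createParentDirectories=True gives it mkdir -p semantics.
        try:
            session.invoke_api(session.vim, 'MakeDirectory',
                               session.vim.service_content.fileManager,
                               name=ds_path, datacenter=dc_ref,
                               createParentDirectories=True)
        except vexc.FileAlreadyExistsException:
            pass  # a concurrent request (as in this log) already created it

Tolerating FileAlreadyExistsException is what lets the two concurrent builds seen above (the SecurityGroups and ServerDiskConfig requests) race on the same cache folder safely.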
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 993.871822] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 993.872053] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 993.872239] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Deleting the datastore file [datastore1] e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 993.872507] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-6859cf8c-3089-40cb-bff1-d30c44442729 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 993.878879] env[68571]: DEBUG oslo_vmware.api [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Waiting for the task: (returnval){ [ 993.878879] env[68571]: value = "task-3467658" [ 993.878879] env[68571]: _type = "Task" [ 993.878879] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 993.887050] env[68571]: DEBUG oslo_vmware.api [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Task: {'id': task-3467658, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 994.321399] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 994.321654] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Creating directory with path [datastore1] vmware_temp/9236193a-28f0-45ef-b62f-2e370863545c/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 994.321887] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7ece8821-2d73-4e26-b6c7-e15d5aa96948 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 994.333610] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Created directory with path [datastore1] vmware_temp/9236193a-28f0-45ef-b62f-2e370863545c/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 994.333819] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Fetch image to [datastore1] vmware_temp/9236193a-28f0-45ef-b62f-2e370863545c/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 994.333986] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/9236193a-28f0-45ef-b62f-2e370863545c/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 994.334744] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b4230b7a-f458-484d-98bb-2f680d443dba {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 994.341696] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21382e5b-ddfb-4b89-b5a2-b81009af5049 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 994.350788] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba1a729c-6448-431f-8a97-0ab06bbba7fc {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 994.383832] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-b67102c7-90e7-4add-a0d2-1573f86ca45b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 994.391146] env[68571]: DEBUG oslo_vmware.api [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Task: {'id': task-3467658, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077171} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 994.392648] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 994.392837] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 994.393031] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 994.394133] env[68571]: INFO nova.compute.manager [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Took 0.60 seconds to destroy the instance on the hypervisor. 
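The DeleteDatastoreFile_Task / "progress is 0%" / "completed successfully" sequence above is oslo.vmware's generic task-wait loop: the client invokes a *_Task method, then polls TaskInfo until the task succeeds or errors; the same loop is what turned the failed CopyVirtualDisk_Task into the VimFaultException in the earlier traceback. A rough equivalent using the public API (connection details and dc_ref are placeholders, not values from this log):

    from oslo_vmware import api as vmware_api

    session = vmware_api.VMwareAPISession(
        'vc.example.test', 'user', 'secret',
        api_retry_count=10, task_poll_interval=0.5)
    dc_ref = None  # placeholder: the datacenter moref from a property query

    task = session.invoke_api(
        session.vim, 'DeleteDatastoreFile_Task',
        session.vim.service_content.fileManager,
        name='[datastore1] e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e',
        datacenter=dc_ref)
    session.wait_for_task(task)  # polls until the task errors or succeeds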
[ 994.394962] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-64f65e94-17e7-4acc-9c62-bcdc28479569 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 994.396918] env[68571]: DEBUG nova.compute.claims [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 994.397109] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 994.397325] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 994.419562] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 994.476184] env[68571]: DEBUG oslo_vmware.rw_handles [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9236193a-28f0-45ef-b62f-2e370863545c/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 994.532907] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 994.535920] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 994.539592] env[68571]: DEBUG oslo_vmware.rw_handles [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Completed reading data from the image iterator. 
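The rw_handles entries above show how the image bytes actually move: oslo.vmware opens an HTTPS write connection against the ESX host's /folder endpoint, streams the Glance image iterator chunk by chunk ("Completed reading data from the image iterator."), then closes the handle, as the next entry records. A hedged sketch of that pattern; host, cookies, path, and the iterator are placeholders rather than values taken as authoritative:

    from oslo_vmware import rw_handles

    cookies = None           # placeholder: taken from the authenticated SOAP session
    image_chunks = iter(())  # placeholder: the Glance image-data iterator
    handle = rw_handles.FileWriteHandle(
        'esx.example.test', 443, 'ha-datacenter', 'datastore1', cookies,
        'vmware_temp/tmp-dir/image-id/tmp-sparse.vmdk', 21318656)
    for chunk in image_chunks:
        handle.write(chunk)
    handle.close()           # logs the "Closing write handle for <url>" entry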
{{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 994.539773] env[68571]: DEBUG oslo_vmware.rw_handles [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9236193a-28f0-45ef-b62f-2e370863545c/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 994.552672] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 994.837988] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-592bad34-c487-48d6-9e59-4726b8fb12f8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 994.845654] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0fcdd5d8-3072-4877-8444-a9d3e8c3849f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 994.874562] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-feb4b939-5a76-4148-8b49-04c8a702e9d6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 994.883017] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd8782f2-0a7e-4481-83a7-55d0ca00a873 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 994.894215] env[68571]: DEBUG nova.compute.provider_tree [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 994.902816] env[68571]: DEBUG nova.scheduler.client.report [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 994.917908] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.520s {{(pid=68571) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 994.918434] env[68571]: ERROR nova.compute.manager [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 994.918434] env[68571]: Faults: ['InvalidArgument'] [ 994.918434] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Traceback (most recent call last): [ 994.918434] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 994.918434] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] self.driver.spawn(context, instance, image_meta, [ 994.918434] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 994.918434] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] self._vmops.spawn(context, instance, image_meta, injected_files, [ 994.918434] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 994.918434] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] self._fetch_image_if_missing(context, vi) [ 994.918434] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 994.918434] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] image_cache(vi, tmp_image_ds_loc) [ 994.918434] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 994.918943] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] vm_util.copy_virtual_disk( [ 994.918943] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 994.918943] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] session._wait_for_task(vmdk_copy_task) [ 994.918943] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 994.918943] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] return self.wait_for_task(task_ref) [ 994.918943] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 994.918943] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] return evt.wait() [ 994.918943] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 994.918943] env[68571]: ERROR nova.compute.manager [instance: 
e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] result = hub.switch() [ 994.918943] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 994.918943] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] return self.greenlet.switch() [ 994.918943] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 994.918943] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] self.f(*self.args, **self.kw) [ 994.919486] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 994.919486] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] raise exceptions.translate_fault(task_info.error) [ 994.919486] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 994.919486] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Faults: ['InvalidArgument'] [ 994.919486] env[68571]: ERROR nova.compute.manager [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] [ 994.919486] env[68571]: DEBUG nova.compute.utils [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 994.920219] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.368s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 994.920400] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 994.920555] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 994.921250] env[68571]: DEBUG nova.compute.manager [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Build of instance e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e was re-scheduled: A specified parameter was not correct: fileType [ 994.921250] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 994.921616] env[68571]: DEBUG nova.compute.manager [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 
tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 994.921784] env[68571]: DEBUG nova.compute.manager [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 994.921948] env[68571]: DEBUG nova.compute.manager [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 994.922177] env[68571]: DEBUG nova.network.neutron [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 994.924302] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24904d34-360c-4c11-859f-643f972fc2e4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 994.932601] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61a96092-7fda-49c5-a1b7-9cab9426d47d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 994.950863] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4da9b49b-e9af-4445-a2f9-bd690174fa26 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 994.956684] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c11313e3-97d2-45bb-8558-00a031a4d764 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 994.985757] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180929MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 994.985923] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 994.986138] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 995.071891] env[68571]: DEBUG
nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 995.072070] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 995.072230] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 995.072324] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 995.072444] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3adaf481-5844-45ac-8dc9-eb396a47ed1c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 995.072562] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance c962c9c7-04a4-46ec-a46f-fac13caa6a1e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 995.072673] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 25f17a16-f752-4927-a2a5-73f1f18e5c8c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 995.072785] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 244ba708-279e-440e-bc18-8c6ee7b83250 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 995.072923] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 995.073109] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance b60eb700-434f-4bea-a84f-9071402001c3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 995.084992] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 5e571ae2-9d45-402d-bce5-6e3721cc5374 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 995.093947] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 18849294-d11e-40ed-9c2a-7706f7409d9a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 995.103292] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 1df720d6-655c-49b6-a65d-d56b757143a7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 995.112293] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 2a80d267-c2f0-4745-b23b-24717e4d9531 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 995.121941] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 466616ca-0cad-4561-b0d6-1e34e3243418 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 995.131340] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4bc7288c-3483-46ed-9c4f-673f86b10446 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 995.141560] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance d910cc12-8da2-4ce6-9107-a54c870405de has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 995.151380] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance d5416006-57fd-4966-90d2-3ba18d3eceba has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 995.163174] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 812bbf03-e2c0-4827-8ed9-cc60611a77ca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 995.174207] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4c70fd0e-9872-423e-8b7e-4c17760d88bc has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 995.184563] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 995.193758] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance b90ac11a-50c6-4d12-a545-ccd92243e6ca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
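The long run of "Skipping heal of allocation" entries is the resource tracker's periodic reconciliation of placement allocations against the instances it knows about: an allocation whose instance was scheduled here but has not started yet is left to the scheduler, an actively managed instance's allocation is kept as-is, and an allocation with no matching instance would be deleted as stale. Roughly, with hypothetical helper names (a simplification of _remove_deleted_instances_allocations, not its actual code):

    def reconcile_allocations(allocations, known_instances, report_client, ctxt):
        for consumer_uuid, alloc in allocations.items():
            inst = known_instances.get(consumer_uuid)
            if inst is None:
                # no such instance any more: the allocation is stale, remove it
                report_client.delete_allocation_for_instance(ctxt, consumer_uuid)
            elif inst.launched_at is None:
                # scheduled to this host but not yet started: skip the heal,
                # exactly what the entries above record
                continue
            # otherwise the instance is actively managed here and the
            # allocation is expected; nothing to do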
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 995.202752] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance afe033a3-4e04-4249-beed-169a3e40a721 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 995.211896] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 14dee505-e30a-4395-9fe3-fb505492c4df has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 995.229091] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 73f10282-d15a-4d6b-a0b9-5b3cb8764ff9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 995.241431] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance e9b8ab85-e972-4081-ae38-602a92fe3ab9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 995.254304] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 061ea5d6-5470-4d7d-9ab1-ae5e606dd9cd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 995.260674] env[68571]: DEBUG nova.network.neutron [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 995.266342] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 7a137e14-98ec-4718-8ff4-3700d2ef7ee9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 995.270552] env[68571]: INFO nova.compute.manager [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Took 0.35 seconds to deallocate network for instance. [ 995.279917] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 9cace51b-100c-48d0-813c-eb31ec9384ec has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 995.290664] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance c8be0938-4b38-4e05-8afa-202d87a315b7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 995.290664] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 9 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 995.290664] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1664MB phys_disk=200GB used_disk=9GB total_vcpus=48 used_vcpus=9 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 995.361142] env[68571]: INFO nova.scheduler.client.report [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Deleted allocations for instance e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e [ 995.387881] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2b8a9e35-56cd-4207-8c7e-88cb597cc58e tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Lock "e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 433.088s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 995.389080] env[68571]: DEBUG oslo_concurrency.lockutils [None req-cfb7f3c7-bd01-4162-a39f-44c04ef698e6 tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Lock "e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 234.754s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 995.389334] env[68571]: DEBUG oslo_concurrency.lockutils [None req-cfb7f3c7-bd01-4162-a39f-44c04ef698e6 tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Acquiring lock
"e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 995.389539] env[68571]: DEBUG oslo_concurrency.lockutils [None req-cfb7f3c7-bd01-4162-a39f-44c04ef698e6 tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Lock "e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 995.389707] env[68571]: DEBUG oslo_concurrency.lockutils [None req-cfb7f3c7-bd01-4162-a39f-44c04ef698e6 tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Lock "e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 995.392133] env[68571]: INFO nova.compute.manager [None req-cfb7f3c7-bd01-4162-a39f-44c04ef698e6 tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Terminating instance [ 995.396840] env[68571]: DEBUG nova.compute.manager [None req-cfb7f3c7-bd01-4162-a39f-44c04ef698e6 tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 995.396840] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-cfb7f3c7-bd01-4162-a39f-44c04ef698e6 tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 995.396954] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-edf8eef3-ab7a-4ebb-b6b3-3ec37c4b0009 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 995.402997] env[68571]: DEBUG nova.compute.manager [None req-115788d6-9044-4382-b713-cec406ac795e tempest-ImagesTestJSON-1315536367 tempest-ImagesTestJSON-1315536367-project-member] [instance: 3986e039-9ed6-46e4-82b0-d3079bc45624] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 995.408357] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0c1bea3-55e5-4790-83f6-9a590238932d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 995.429181] env[68571]: DEBUG nova.compute.manager [None req-115788d6-9044-4382-b713-cec406ac795e tempest-ImagesTestJSON-1315536367 tempest-ImagesTestJSON-1315536367-project-member] [instance: 3986e039-9ed6-46e4-82b0-d3079bc45624] Instance disappeared before build. 
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 995.439315] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-cfb7f3c7-bd01-4162-a39f-44c04ef698e6 tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e could not be found. [ 995.439530] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-cfb7f3c7-bd01-4162-a39f-44c04ef698e6 tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 995.439707] env[68571]: INFO nova.compute.manager [None req-cfb7f3c7-bd01-4162-a39f-44c04ef698e6 tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Took 0.04 seconds to destroy the instance on the hypervisor. [ 995.439948] env[68571]: DEBUG oslo.service.loopingcall [None req-cfb7f3c7-bd01-4162-a39f-44c04ef698e6 tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 995.442603] env[68571]: DEBUG nova.compute.manager [-] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 995.442772] env[68571]: DEBUG nova.network.neutron [-] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 995.463564] env[68571]: DEBUG oslo_concurrency.lockutils [None req-115788d6-9044-4382-b713-cec406ac795e tempest-ImagesTestJSON-1315536367 tempest-ImagesTestJSON-1315536367-project-member] Lock "3986e039-9ed6-46e4-82b0-d3079bc45624" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 212.163s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 995.469018] env[68571]: DEBUG nova.network.neutron [-] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 995.474112] env[68571]: DEBUG nova.compute.manager [None req-6add77f1-94e8-4c73-ae1a-f374dd310014 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: ca22d1a8-0a38-4e91-a3e8-8d0872d2ea31] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 995.479147] env[68571]: INFO nova.compute.manager [-] [instance: e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e] Took 0.03 seconds to deallocate network for instance. [ 995.498954] env[68571]: DEBUG nova.compute.manager [None req-6add77f1-94e8-4c73-ae1a-f374dd310014 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: ca22d1a8-0a38-4e91-a3e8-8d0872d2ea31] Instance disappeared before build.
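The oslo.service.loopingcall entry ("Waiting for function ... _deallocate_network_with_retries to return.") is the retry wrapper around network cleanup: the function runs inside a looping call until it raises LoopingCallDone, and the caller blocks on the returned event. The general pattern, simplified (nova's actual wrapper uses a back-off variant; deallocate_network here is a hypothetical stand-in):

    from oslo_service import loopingcall

    def deallocate_network():
        pass  # stand-in for the real neutron deallocation, which may raise

    def _deallocate_with_retries():
        try:
            deallocate_network()
        except Exception:
            return                           # run again on the next interval
        raise loopingcall.LoopingCallDone()  # success: stop looping

    timer = loopingcall.FixedIntervalLoopingCall(_deallocate_with_retries)
    timer.start(interval=1).wait()           # blocks until LoopingCallDone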
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 995.517809] env[68571]: DEBUG oslo_concurrency.lockutils [None req-6add77f1-94e8-4c73-ae1a-f374dd310014 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Lock "ca22d1a8-0a38-4e91-a3e8-8d0872d2ea31" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 207.391s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 995.527118] env[68571]: DEBUG nova.compute.manager [None req-2c5d925a-7912-400e-93bc-ca65894842e3 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 7f9587d5-7089-4e51-961e-88e83c573cb3] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 995.552358] env[68571]: DEBUG nova.compute.manager [None req-2c5d925a-7912-400e-93bc-ca65894842e3 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 7f9587d5-7089-4e51-961e-88e83c573cb3] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 995.573513] env[68571]: DEBUG oslo_concurrency.lockutils [None req-cfb7f3c7-bd01-4162-a39f-44c04ef698e6 tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Lock "e1e0f1fc-a1b7-47ff-8f78-4ad09d2a005e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.184s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 995.575295] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2c5d925a-7912-400e-93bc-ca65894842e3 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Lock "7f9587d5-7089-4e51-961e-88e83c573cb3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 203.694s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 995.586728] env[68571]: DEBUG nova.compute.manager [None req-d839f976-7990-4507-8f98-9b719e6b52bf tempest-VolumesAdminNegativeTest-600178275 tempest-VolumesAdminNegativeTest-600178275-project-member] [instance: 249cf445-30fa-4de2-b09d-b8210eb3effa] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 995.615120] env[68571]: DEBUG nova.compute.manager [None req-d839f976-7990-4507-8f98-9b719e6b52bf tempest-VolumesAdminNegativeTest-600178275 tempest-VolumesAdminNegativeTest-600178275-project-member] [instance: 249cf445-30fa-4de2-b09d-b8210eb3effa] Instance disappeared before build.
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 995.638027] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d839f976-7990-4507-8f98-9b719e6b52bf tempest-VolumesAdminNegativeTest-600178275 tempest-VolumesAdminNegativeTest-600178275-project-member] Lock "249cf445-30fa-4de2-b09d-b8210eb3effa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 201.269s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 995.648924] env[68571]: DEBUG nova.compute.manager [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 995.698998] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 995.702260] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ceb410dd-5583-4d33-8462-7ea6798b7224 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 995.709730] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce5b94b4-8dfb-486e-969d-aeee7def7ac2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 995.739352] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3ff02623-0df2-470c-b781-529b12e3e840 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 995.746531] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ccf8c38-728c-4a09-ae96-59bcc4f1e4e2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 995.759812] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 995.767407] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 995.784651] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None]
Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 995.784841] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.799s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 995.785122] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.086s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 995.786551] env[68571]: INFO nova.compute.claims [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 996.164738] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f43fc6d-199d-4de7-999d-7e4f989aa8f5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 996.173642] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07a0bb4d-9a10-456a-b3ac-1f2d56d2c1af {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 996.202948] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44260321-5a7b-4851-be2c-fa8a0a27497e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 996.210332] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-937bfb04-18f7-4ec8-a289-2e1205ab2c12 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 996.223145] env[68571]: DEBUG nova.compute.provider_tree [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 996.231861] env[68571]: DEBUG nova.scheduler.client.report [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider 
/opt/stack/nova/nova/scheduler/client/report.py:940}} [ 996.244058] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.459s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 996.244501] env[68571]: DEBUG nova.compute.manager [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 996.278888] env[68571]: DEBUG nova.compute.utils [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 996.280501] env[68571]: DEBUG nova.compute.manager [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 996.280608] env[68571]: DEBUG nova.network.neutron [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 996.290834] env[68571]: DEBUG nova.compute.manager [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 996.341847] env[68571]: DEBUG nova.policy [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ef480c5af7dd41c58adbbaee783c29c7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0fc64140a8ed4dfe95232d52e6978add', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 996.356966] env[68571]: DEBUG nova.compute.manager [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 996.383504] env[68571]: DEBUG nova.virt.hardware [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=<?>,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T21:24:40Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 996.383811] env[68571]: DEBUG nova.virt.hardware [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 996.383977] env[68571]: DEBUG nova.virt.hardware [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 996.384193] env[68571]: DEBUG nova.virt.hardware [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 996.384356] env[68571]: DEBUG nova.virt.hardware [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 996.384516] env[68571]: DEBUG nova.virt.hardware [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 996.384731] env[68571]: DEBUG nova.virt.hardware [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 996.384889] env[68571]: DEBUG nova.virt.hardware [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] 
Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 996.385074] env[68571]: DEBUG nova.virt.hardware [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 996.385240] env[68571]: DEBUG nova.virt.hardware [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 996.385412] env[68571]: DEBUG nova.virt.hardware [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 996.386287] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d53fc0b4-8019-486a-b059-fa3b9c937da9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 996.394216] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef05307d-fa2f-40ea-923a-c9428ef6bfc2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 996.644871] env[68571]: DEBUG nova.network.neutron [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Successfully created port: 1ceb4958-0759-4fb8-8f67-381c490f5ed8 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 996.742690] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 996.742690] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 996.767838] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 997.237718] env[68571]: DEBUG nova.network.neutron [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Successfully updated port: 1ceb4958-0759-4fb8-8f67-381c490f5ed8 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 997.250900] env[68571]: DEBUG oslo_concurrency.lockutils [None 
req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Acquiring lock "refresh_cache-5e571ae2-9d45-402d-bce5-6e3721cc5374" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 997.250900] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Acquired lock "refresh_cache-5e571ae2-9d45-402d-bce5-6e3721cc5374" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 997.250900] env[68571]: DEBUG nova.network.neutron [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 997.295463] env[68571]: DEBUG nova.network.neutron [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 997.313127] env[68571]: DEBUG nova.compute.manager [req-4cefe380-ca22-4802-b01c-c819de718b93 req-5b78a328-ae34-4790-893d-d42ad837bd58 service nova] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Received event network-vif-plugged-1ceb4958-0759-4fb8-8f67-381c490f5ed8 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 997.313339] env[68571]: DEBUG oslo_concurrency.lockutils [req-4cefe380-ca22-4802-b01c-c819de718b93 req-5b78a328-ae34-4790-893d-d42ad837bd58 service nova] Acquiring lock "5e571ae2-9d45-402d-bce5-6e3721cc5374-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 997.313537] env[68571]: DEBUG oslo_concurrency.lockutils [req-4cefe380-ca22-4802-b01c-c819de718b93 req-5b78a328-ae34-4790-893d-d42ad837bd58 service nova] Lock "5e571ae2-9d45-402d-bce5-6e3721cc5374-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 997.313698] env[68571]: DEBUG oslo_concurrency.lockutils [req-4cefe380-ca22-4802-b01c-c819de718b93 req-5b78a328-ae34-4790-893d-d42ad837bd58 service nova] Lock "5e571ae2-9d45-402d-bce5-6e3721cc5374-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 997.313859] env[68571]: DEBUG nova.compute.manager [req-4cefe380-ca22-4802-b01c-c819de718b93 req-5b78a328-ae34-4790-893d-d42ad837bd58 service nova] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] No waiting events found dispatching network-vif-plugged-1ceb4958-0759-4fb8-8f67-381c490f5ed8 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 997.314031] env[68571]: WARNING nova.compute.manager 
[req-4cefe380-ca22-4802-b01c-c819de718b93 req-5b78a328-ae34-4790-893d-d42ad837bd58 service nova] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Received unexpected event network-vif-plugged-1ceb4958-0759-4fb8-8f67-381c490f5ed8 for instance with vm_state building and task_state spawning. [ 997.314189] env[68571]: DEBUG nova.compute.manager [req-4cefe380-ca22-4802-b01c-c819de718b93 req-5b78a328-ae34-4790-893d-d42ad837bd58 service nova] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Received event network-changed-1ceb4958-0759-4fb8-8f67-381c490f5ed8 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 997.314344] env[68571]: DEBUG nova.compute.manager [req-4cefe380-ca22-4802-b01c-c819de718b93 req-5b78a328-ae34-4790-893d-d42ad837bd58 service nova] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Refreshing instance network info cache due to event network-changed-1ceb4958-0759-4fb8-8f67-381c490f5ed8. {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 997.314503] env[68571]: DEBUG oslo_concurrency.lockutils [req-4cefe380-ca22-4802-b01c-c819de718b93 req-5b78a328-ae34-4790-893d-d42ad837bd58 service nova] Acquiring lock "refresh_cache-5e571ae2-9d45-402d-bce5-6e3721cc5374" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 997.465408] env[68571]: DEBUG nova.network.neutron [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Updating instance_info_cache with network_info: [{"id": "1ceb4958-0759-4fb8-8f67-381c490f5ed8", "address": "fa:16:3e:47:08:67", "network": {"id": "eeb639e1-8251-4dee-8073-7e8fc86ef5e1", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-408711133-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0fc64140a8ed4dfe95232d52e6978add", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c4810e0b-c5e1-43ca-8d35-de29f7ebe7b0", "external-id": "nsx-vlan-transportzone-60", "segmentation_id": 60, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1ceb4958-07", "ovs_interfaceid": "1ceb4958-0759-4fb8-8f67-381c490f5ed8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 997.478412] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Releasing lock "refresh_cache-5e571ae2-9d45-402d-bce5-6e3721cc5374" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 997.478689] env[68571]: DEBUG nova.compute.manager [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 
tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Instance network_info: |[{"id": "1ceb4958-0759-4fb8-8f67-381c490f5ed8", "address": "fa:16:3e:47:08:67", "network": {"id": "eeb639e1-8251-4dee-8073-7e8fc86ef5e1", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-408711133-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0fc64140a8ed4dfe95232d52e6978add", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c4810e0b-c5e1-43ca-8d35-de29f7ebe7b0", "external-id": "nsx-vlan-transportzone-60", "segmentation_id": 60, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1ceb4958-07", "ovs_interfaceid": "1ceb4958-0759-4fb8-8f67-381c490f5ed8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 997.478990] env[68571]: DEBUG oslo_concurrency.lockutils [req-4cefe380-ca22-4802-b01c-c819de718b93 req-5b78a328-ae34-4790-893d-d42ad837bd58 service nova] Acquired lock "refresh_cache-5e571ae2-9d45-402d-bce5-6e3721cc5374" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 997.479214] env[68571]: DEBUG nova.network.neutron [req-4cefe380-ca22-4802-b01c-c819de718b93 req-5b78a328-ae34-4790-893d-d42ad837bd58 service nova] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Refreshing network info cache for port 1ceb4958-0759-4fb8-8f67-381c490f5ed8 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 997.480256] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:47:08:67', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'c4810e0b-c5e1-43ca-8d35-de29f7ebe7b0', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '1ceb4958-0759-4fb8-8f67-381c490f5ed8', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 997.487675] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Creating folder: Project (0fc64140a8ed4dfe95232d52e6978add). Parent ref: group-v692787. 
{{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 997.488557] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-05ed906e-6706-4b6d-838b-fb29160c58de {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 997.502848] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Created folder: Project (0fc64140a8ed4dfe95232d52e6978add) in parent group-v692787. [ 997.503107] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Creating folder: Instances. Parent ref: group-v692848. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 997.503291] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-adb911a6-dbcb-478d-8a2a-3f2354def8b5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 997.511965] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Created folder: Instances in parent group-v692848. [ 997.512218] env[68571]: DEBUG oslo.service.loopingcall [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 997.512411] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 997.512606] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-05e7f583-39ff-47ac-ba8e-2e42868aa9a9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 997.533797] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 997.533797] env[68571]: value = "task-3467661" [ 997.533797] env[68571]: _type = "Task" [ 997.533797] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 997.544281] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467661, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 997.737100] env[68571]: DEBUG nova.network.neutron [req-4cefe380-ca22-4802-b01c-c819de718b93 req-5b78a328-ae34-4790-893d-d42ad837bd58 service nova] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Updated VIF entry in instance network info cache for port 1ceb4958-0759-4fb8-8f67-381c490f5ed8. 
{{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 997.737472] env[68571]: DEBUG nova.network.neutron [req-4cefe380-ca22-4802-b01c-c819de718b93 req-5b78a328-ae34-4790-893d-d42ad837bd58 service nova] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Updating instance_info_cache with network_info: [{"id": "1ceb4958-0759-4fb8-8f67-381c490f5ed8", "address": "fa:16:3e:47:08:67", "network": {"id": "eeb639e1-8251-4dee-8073-7e8fc86ef5e1", "bridge": "br-int", "label": "tempest-ServersNegativeTestMultiTenantJSON-408711133-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "0fc64140a8ed4dfe95232d52e6978add", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "c4810e0b-c5e1-43ca-8d35-de29f7ebe7b0", "external-id": "nsx-vlan-transportzone-60", "segmentation_id": 60, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap1ceb4958-07", "ovs_interfaceid": "1ceb4958-0759-4fb8-8f67-381c490f5ed8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 997.746682] env[68571]: DEBUG oslo_concurrency.lockutils [req-4cefe380-ca22-4802-b01c-c819de718b93 req-5b78a328-ae34-4790-893d-d42ad837bd58 service nova] Releasing lock "refresh_cache-5e571ae2-9d45-402d-bce5-6e3721cc5374" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 998.043473] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467661, 'name': CreateVM_Task, 'duration_secs': 0.28163} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 998.043627] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 998.044299] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 998.044469] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 998.044759] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 998.045011] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-444683c6-3a38-43ea-8cc6-8604362f6749 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 998.049132] env[68571]: DEBUG oslo_vmware.api [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Waiting for the task: (returnval){ [ 998.049132] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]528548bb-f8e0-2910-2f5d-0b2a0bc62c05" [ 998.049132] env[68571]: _type = "Task" [ 998.049132] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 998.056407] env[68571]: DEBUG oslo_vmware.api [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]528548bb-f8e0-2910-2f5d-0b2a0bc62c05, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 998.488986] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 998.559945] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 998.559945] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 998.559945] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 999.489232] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 999.489495] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 999.489547] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 999.516926] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 999.517144] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 999.517282] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Skipping network cache update for instance because it is Building. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 999.517410] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 999.517542] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 999.517668] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 999.517789] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 999.517910] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 999.518532] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 999.518532] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 999.518532] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 999.518928] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 999.520031] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1000.416224] env[68571]: DEBUG oslo_concurrency.lockutils [None req-6a58fc7b-70fb-4642-9878-02de6e811bb1 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Acquiring lock "5e571ae2-9d45-402d-bce5-6e3721cc5374" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1000.489800] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1001.936316] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Acquiring lock "f5328efa-b3e0-48b2-8f13-9715e46cb017" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1001.936602] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Lock "f5328efa-b3e0-48b2-8f13-9715e46cb017" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1002.666036] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d4c87d72-4aff-4ed9-b707-dc0215a8c5b8 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Acquiring lock "2c21a8e5-da7f-4b3a-97ab-ec35f794edac" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1002.666277] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d4c87d72-4aff-4ed9-b707-dc0215a8c5b8 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Lock "2c21a8e5-da7f-4b3a-97ab-ec35f794edac" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1011.837865] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a1fd15ef-5912-4ad1-bb04-6276ffb5b1dc tempest-AttachInterfacesTestJSON-2026169319 tempest-AttachInterfacesTestJSON-2026169319-project-member] Acquiring lock "5af733d9-dfa4-4059-8e33-1818695c8692" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1011.837865] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a1fd15ef-5912-4ad1-bb04-6276ffb5b1dc tempest-AttachInterfacesTestJSON-2026169319 tempest-AttachInterfacesTestJSON-2026169319-project-member] Lock "5af733d9-dfa4-4059-8e33-1818695c8692" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: 
waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1014.519339] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3782ec78-11e2-4f9a-9963-60dbdb163d8e tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquiring lock "a333a6c9-5119-4d2f-81f3-cb86795ed364" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1014.519957] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3782ec78-11e2-4f9a-9963-60dbdb163d8e tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Lock "a333a6c9-5119-4d2f-81f3-cb86795ed364" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1023.024676] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5fda29f5-c038-4323-b6bc-258ade178d39 tempest-ServerGroupTestJSON-854930141 tempest-ServerGroupTestJSON-854930141-project-member] Acquiring lock "1cbb0e1a-ca70-4e0e-9adb-c4b62e80818b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1023.024676] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5fda29f5-c038-4323-b6bc-258ade178d39 tempest-ServerGroupTestJSON-854930141 tempest-ServerGroupTestJSON-854930141-project-member] Lock "1cbb0e1a-ca70-4e0e-9adb-c4b62e80818b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1026.780581] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e1c3dab6-4afd-4ba5-a7d4-c16cfe2f13e8 tempest-ServerRescueTestJSONUnderV235-1698324520 tempest-ServerRescueTestJSONUnderV235-1698324520-project-member] Acquiring lock "df5d4c12-01c8-46e2-b2a9-cf61a7d10e1a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1026.780916] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e1c3dab6-4afd-4ba5-a7d4-c16cfe2f13e8 tempest-ServerRescueTestJSONUnderV235-1698324520 tempest-ServerRescueTestJSONUnderV235-1698324520-project-member] Lock "df5d4c12-01c8-46e2-b2a9-cf61a7d10e1a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1026.855817] env[68571]: DEBUG oslo_concurrency.lockutils [None req-410d438e-4fda-4576-9b01-83c7e267ae75 tempest-ServersAaction247Test-231076223 tempest-ServersAaction247Test-231076223-project-member] Acquiring lock "466d2eae-c109-4286-a223-edca73d6c8fa" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1026.856117] env[68571]: DEBUG oslo_concurrency.lockutils [None req-410d438e-4fda-4576-9b01-83c7e267ae75 
tempest-ServersAaction247Test-231076223 tempest-ServersAaction247Test-231076223-project-member] Lock "466d2eae-c109-4286-a223-edca73d6c8fa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1041.363450] env[68571]: WARNING oslo_vmware.rw_handles [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1041.363450] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1041.363450] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1041.363450] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1041.363450] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1041.363450] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 1041.363450] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1041.363450] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1041.363450] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1041.363450] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1041.363450] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1041.363450] env[68571]: ERROR oslo_vmware.rw_handles [ 1041.364072] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/9236193a-28f0-45ef-b62f-2e370863545c/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1041.365869] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1041.366125] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Copying Virtual Disk [datastore1] vmware_temp/9236193a-28f0-45ef-b62f-2e370863545c/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/9236193a-28f0-45ef-b62f-2e370863545c/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1041.366423] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-a556f5e1-d7e3-419a-919c-402a1c4644e0 {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1041.374596] env[68571]: DEBUG oslo_vmware.api [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Waiting for the task: (returnval){ [ 1041.374596] env[68571]: value = "task-3467662" [ 1041.374596] env[68571]: _type = "Task" [ 1041.374596] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1041.383165] env[68571]: DEBUG oslo_vmware.api [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Task: {'id': task-3467662, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1041.884809] env[68571]: DEBUG oslo_vmware.exceptions [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Fault InvalidArgument not matched. {{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1041.885199] env[68571]: DEBUG oslo_concurrency.lockutils [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1041.885820] env[68571]: ERROR nova.compute.manager [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1041.885820] env[68571]: Faults: ['InvalidArgument'] [ 1041.885820] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Traceback (most recent call last): [ 1041.885820] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1041.885820] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] yield resources [ 1041.885820] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1041.885820] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] self.driver.spawn(context, instance, image_meta, [ 1041.885820] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1041.885820] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1041.885820] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1041.885820] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] 
self._fetch_image_if_missing(context, vi) [ 1041.885820] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1041.886588] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] image_cache(vi, tmp_image_ds_loc) [ 1041.886588] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1041.886588] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] vm_util.copy_virtual_disk( [ 1041.886588] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1041.886588] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] session._wait_for_task(vmdk_copy_task) [ 1041.886588] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1041.886588] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] return self.wait_for_task(task_ref) [ 1041.886588] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1041.886588] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] return evt.wait() [ 1041.886588] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1041.886588] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] result = hub.switch() [ 1041.886588] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1041.886588] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] return self.greenlet.switch() [ 1041.887255] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1041.887255] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] self.f(*self.args, **self.kw) [ 1041.887255] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1041.887255] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] raise exceptions.translate_fault(task_info.error) [ 1041.887255] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1041.887255] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Faults: ['InvalidArgument'] [ 1041.887255] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] [ 1041.887255] env[68571]: INFO nova.compute.manager [None 
req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Terminating instance [ 1041.888269] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1041.888547] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1041.888831] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-674dcaa7-407d-40eb-b0ae-b0949b8ba614 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1041.891251] env[68571]: DEBUG nova.compute.manager [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1041.891497] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1041.892278] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73e8d8d0-a97a-4e17-ae22-1ead8e590b9a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1041.898995] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1041.900134] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-656168c1-a1db-4f79-b490-05009d0ddbd9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1041.901774] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1041.901850] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Folder [datastore1] 
devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1041.902800] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4a1e08bd-5b58-4b2b-bff2-095542f9d540 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1041.907783] env[68571]: DEBUG oslo_vmware.api [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Waiting for the task: (returnval){ [ 1041.907783] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]5255c556-4831-c6af-df28-be99d6c1dd5b" [ 1041.907783] env[68571]: _type = "Task" [ 1041.907783] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1041.924421] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1041.924800] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Creating directory with path [datastore1] vmware_temp/c4e99682-09b0-463b-a58f-3c69c48b07cf/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1041.925162] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-df15388b-92a9-45b2-96f8-6345161e7832 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1041.947783] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Created directory with path [datastore1] vmware_temp/c4e99682-09b0-463b-a58f-3c69c48b07cf/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1041.948158] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Fetch image to [datastore1] vmware_temp/c4e99682-09b0-463b-a58f-3c69c48b07cf/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1041.948441] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/c4e99682-09b0-463b-a58f-3c69c48b07cf/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1041.949314] env[68571]: DEBUG 
oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-551adcff-3bab-4b35-92ca-33fbbf2ec2e9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1041.956613] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-622278b1-e1e8-4bfc-9ef0-22377ba89c5e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1041.966541] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3df7d6c9-5688-4f8a-861a-30571bb49cc7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1041.971191] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1041.971395] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1041.971569] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Deleting the datastore file [datastore1] cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9 {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1041.972181] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-fb4b3f54-680b-4f42-b52b-c5336fc7652a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1042.001326] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-939d6202-dc6a-41b6-bcd4-b72f05907f67 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1042.003982] env[68571]: DEBUG oslo_vmware.api [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Waiting for the task: (returnval){ [ 1042.003982] env[68571]: value = "task-3467664" [ 1042.003982] env[68571]: _type = "Task" [ 1042.003982] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1042.009081] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c94948d9-d397-4d2c-a95a-100ab7964a60 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1042.013277] env[68571]: DEBUG oslo_vmware.api [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Task: {'id': task-3467664, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1042.051081] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1042.102085] env[68571]: DEBUG oslo_vmware.rw_handles [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c4e99682-09b0-463b-a58f-3c69c48b07cf/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1042.161462] env[68571]: DEBUG oslo_vmware.rw_handles [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1042.161695] env[68571]: DEBUG oslo_vmware.rw_handles [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c4e99682-09b0-463b-a58f-3c69c48b07cf/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1042.514331] env[68571]: DEBUG oslo_vmware.api [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Task: {'id': task-3467664, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075453} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1042.514639] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1042.514787] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1042.515284] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1042.515492] env[68571]: INFO nova.compute.manager [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Took 0.62 seconds to destroy the instance on the hypervisor. [ 1042.517619] env[68571]: DEBUG nova.compute.claims [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1042.517816] env[68571]: DEBUG oslo_concurrency.lockutils [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1042.518063] env[68571]: DEBUG oslo_concurrency.lockutils [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1042.838291] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c61e9273-cbd9-4260-bc32-e3d17b980006 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1042.845792] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aed59f7f-0b08-4294-8083-5d86acdee4f1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1042.874317] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e055602-76fb-4e32-b8b9-cdaf4015613d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1042.881064] env[68571]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58986bce-42b7-469b-896b-d7795944bfc8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1042.894556] env[68571]: DEBUG nova.compute.provider_tree [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1042.905060] env[68571]: DEBUG nova.scheduler.client.report [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1042.918074] env[68571]: DEBUG oslo_concurrency.lockutils [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.400s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1042.918573] env[68571]: ERROR nova.compute.manager [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1042.918573] env[68571]: Faults: ['InvalidArgument']
[ 1042.918573] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Traceback (most recent call last):
[ 1042.918573] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1042.918573] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] self.driver.spawn(context, instance, image_meta,
[ 1042.918573] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1042.918573] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1042.918573] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1042.918573] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] self._fetch_image_if_missing(context, vi)
[ 1042.918573] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1042.918573] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] image_cache(vi, tmp_image_ds_loc)
[ 1042.918573] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1042.919038] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] vm_util.copy_virtual_disk(
[ 1042.919038] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1042.919038] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] session._wait_for_task(vmdk_copy_task)
[ 1042.919038] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1042.919038] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] return self.wait_for_task(task_ref)
[ 1042.919038] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1042.919038] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] return evt.wait()
[ 1042.919038] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1042.919038] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] result = hub.switch()
[ 1042.919038] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1042.919038] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] return self.greenlet.switch()
[ 1042.919038] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1042.919038] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] self.f(*self.args, **self.kw)
[ 1042.919597] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1042.919597] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] raise exceptions.translate_fault(task_info.error)
[ 1042.919597] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1042.919597] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Faults: ['InvalidArgument']
[ 1042.919597] env[68571]: ERROR nova.compute.manager [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9]
[ 1042.919597] env[68571]: DEBUG nova.compute.utils [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] VimFaultException {{(pid=68571) notify_about_instance_usage
/opt/stack/nova/nova/compute/utils.py:430}} [ 1042.920664] env[68571]: DEBUG nova.compute.manager [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Build of instance cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9 was re-scheduled: A specified parameter was not correct: fileType [ 1042.920664] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1042.921047] env[68571]: DEBUG nova.compute.manager [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1042.921224] env[68571]: DEBUG nova.compute.manager [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1042.921390] env[68571]: DEBUG nova.compute.manager [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1042.921555] env[68571]: DEBUG nova.network.neutron [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1043.258762] env[68571]: DEBUG nova.network.neutron [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1043.270007] env[68571]: INFO nova.compute.manager [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Took 0.35 seconds to deallocate network for instance. 
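The two tracebacks above follow one path: the sparse-image cache step starts a CopyVirtualDisk_Task, vCenter rejects the call server-side with an InvalidArgument fault on fileType, and oslo.vmware re-raises the task error as VimFaultException from its polling loop. Below is a minimal, hedged sketch of that call-and-wait pattern; session stands for an oslo.vmware VMwareAPISession, and dc_ref, src and dest are illustrative placeholders rather than values from this log.

    # Hedged sketch of the copy-and-wait pattern in the tracebacks above.
    from oslo_vmware import exceptions as vexc

    def copy_sparse_vmdk(session, dc_ref, src, dest):
        # Start the server-side disk copy; vCenter validates the arguments
        # here, which is where "A specified parameter was not correct:
        # fileType" originates.
        task = session.invoke_api(
            session.vim, 'CopyVirtualDisk_Task',
            session.vim.service_content.virtualDiskManager,
            sourceName=src, sourceDatacenter=dc_ref,
            destName=dest, destDatacenter=dc_ref)
        try:
            # wait_for_task() polls TaskInfo via a looping call (the
            # loopingcall/_poll_task frames above) and raises the task's
            # fault as VimFaultException once the task reports an error.
            return session.wait_for_task(task)
        except vexc.VimFaultException as e:
            # e.fault_list names the faults (here ['InvalidArgument']);
            # the compute manager then aborts its resource claim and
            # reschedules the build, as the surrounding records show.
            raise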
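Interleaved with that failure, the build of instance ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2 streams its image data straight to the datastore over HTTPS (the rw_handles records above). The sketch below assumes oslo.vmware's FileWriteHandle is what backs those records; the host, cookies, paths and size are placeholders.

    # Hedged sketch of the datastore upload behind the rw_handles records.
    from oslo_vmware import rw_handles

    def upload_image(image_iter, host, port, dc_path, ds_name,
                     cookies, file_path, size):
        # Opens the write connection logged as "Creating HTTP connection
        # to write to file with size = ... and URL =
        # https://<host>:<port>/folder/<file_path>?dcPath=...&dsName=...".
        handle = rw_handles.FileWriteHandle(
            host, port, dc_path, ds_name, cookies, file_path, size)
        try:
            for chunk in image_iter:   # bytes read from the Glance image
                handle.write(chunk)
        finally:
            handle.close()             # "Closing write handle for ..."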
[ 1043.369987] env[68571]: INFO nova.scheduler.client.report [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Deleted allocations for instance cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9 [ 1043.393937] env[68571]: DEBUG oslo_concurrency.lockutils [None req-290074a0-db5a-4eb0-8a48-70149636ff01 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 477.738s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1043.395114] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a8305619-a023-4f57-986a-79edf6d30c69 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 278.281s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1043.395347] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a8305619-a023-4f57-986a-79edf6d30c69 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1043.395548] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a8305619-a023-4f57-986a-79edf6d30c69 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1043.395725] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a8305619-a023-4f57-986a-79edf6d30c69 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1043.398044] env[68571]: INFO nova.compute.manager [None req-a8305619-a023-4f57-986a-79edf6d30c69 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Terminating instance [ 1043.400621] env[68571]: DEBUG nova.compute.manager [None req-a8305619-a023-4f57-986a-79edf6d30c69 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Start destroying the instance on the hypervisor. 
{{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1043.400621] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a8305619-a023-4f57-986a-79edf6d30c69 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1043.400621] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-713a8c46-15ee-4de1-b92c-ae242fd8fa48 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1043.408964] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd89618e-6b78-43bf-b69f-b50b510c0acd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1043.420730] env[68571]: DEBUG nova.compute.manager [None req-43ac91dd-1e2a-4c19-a6eb-b9668c6993b6 tempest-FloatingIPsAssociationNegativeTestJSON-559490589 tempest-FloatingIPsAssociationNegativeTestJSON-559490589-project-member] [instance: 18849294-d11e-40ed-9c2a-7706f7409d9a] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1043.440401] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-a8305619-a023-4f57-986a-79edf6d30c69 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9 could not be found. [ 1043.440540] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a8305619-a023-4f57-986a-79edf6d30c69 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1043.440684] env[68571]: INFO nova.compute.manager [None req-a8305619-a023-4f57-986a-79edf6d30c69 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1043.440922] env[68571]: DEBUG oslo.service.loopingcall [None req-a8305619-a023-4f57-986a-79edf6d30c69 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1043.441155] env[68571]: DEBUG nova.compute.manager [-] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1043.441247] env[68571]: DEBUG nova.network.neutron [-] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1043.445182] env[68571]: DEBUG nova.compute.manager [None req-43ac91dd-1e2a-4c19-a6eb-b9668c6993b6 tempest-FloatingIPsAssociationNegativeTestJSON-559490589 tempest-FloatingIPsAssociationNegativeTestJSON-559490589-project-member] [instance: 18849294-d11e-40ed-9c2a-7706f7409d9a] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1043.470027] env[68571]: DEBUG nova.network.neutron [-] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1043.472324] env[68571]: DEBUG oslo_concurrency.lockutils [None req-43ac91dd-1e2a-4c19-a6eb-b9668c6993b6 tempest-FloatingIPsAssociationNegativeTestJSON-559490589 tempest-FloatingIPsAssociationNegativeTestJSON-559490589-project-member] Lock "18849294-d11e-40ed-9c2a-7706f7409d9a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 234.345s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1043.477545] env[68571]: INFO nova.compute.manager [-] [instance: cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9] Took 0.04 seconds to deallocate network for instance. [ 1043.481948] env[68571]: DEBUG nova.compute.manager [None req-5ec4357c-b59a-488c-a1a6-459deb51b113 tempest-AttachInterfacesTestJSON-2026169319 tempest-AttachInterfacesTestJSON-2026169319-project-member] [instance: 1df720d6-655c-49b6-a65d-d56b757143a7] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1043.509018] env[68571]: DEBUG nova.compute.manager [None req-5ec4357c-b59a-488c-a1a6-459deb51b113 tempest-AttachInterfacesTestJSON-2026169319 tempest-AttachInterfacesTestJSON-2026169319-project-member] [instance: 1df720d6-655c-49b6-a65d-d56b757143a7] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1043.529555] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5ec4357c-b59a-488c-a1a6-459deb51b113 tempest-AttachInterfacesTestJSON-2026169319 tempest-AttachInterfacesTestJSON-2026169319-project-member] Lock "1df720d6-655c-49b6-a65d-d56b757143a7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 232.111s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1043.544605] env[68571]: DEBUG nova.compute.manager [None req-418088bb-dc24-4b40-aeb7-a26e72151b8d tempest-ServerActionsTestOtherA-263171523 tempest-ServerActionsTestOtherA-263171523-project-member] [instance: 2a80d267-c2f0-4745-b23b-24717e4d9531] Starting instance... 
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1043.585847] env[68571]: DEBUG nova.compute.manager [None req-418088bb-dc24-4b40-aeb7-a26e72151b8d tempest-ServerActionsTestOtherA-263171523 tempest-ServerActionsTestOtherA-263171523-project-member] [instance: 2a80d267-c2f0-4745-b23b-24717e4d9531] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1043.603476] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a8305619-a023-4f57-986a-79edf6d30c69 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "cf2dc0a3-47fe-4fa6-b7f0-3e03328c1cb9" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.208s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1043.611258] env[68571]: DEBUG oslo_concurrency.lockutils [None req-418088bb-dc24-4b40-aeb7-a26e72151b8d tempest-ServerActionsTestOtherA-263171523 tempest-ServerActionsTestOtherA-263171523-project-member] Lock "2a80d267-c2f0-4745-b23b-24717e4d9531" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 229.234s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1043.621483] env[68571]: DEBUG nova.compute.manager [None req-3ed62991-e7ab-4982-b992-98a17ee108de tempest-MultipleCreateTestJSON-976188141 tempest-MultipleCreateTestJSON-976188141-project-member] [instance: 466616ca-0cad-4561-b0d6-1e34e3243418] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1043.643500] env[68571]: DEBUG nova.compute.manager [None req-3ed62991-e7ab-4982-b992-98a17ee108de tempest-MultipleCreateTestJSON-976188141 tempest-MultipleCreateTestJSON-976188141-project-member] [instance: 466616ca-0cad-4561-b0d6-1e34e3243418] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1043.663707] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3ed62991-e7ab-4982-b992-98a17ee108de tempest-MultipleCreateTestJSON-976188141 tempest-MultipleCreateTestJSON-976188141-project-member] Lock "466616ca-0cad-4561-b0d6-1e34e3243418" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 227.375s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1043.672654] env[68571]: DEBUG nova.compute.manager [None req-3ed62991-e7ab-4982-b992-98a17ee108de tempest-MultipleCreateTestJSON-976188141 tempest-MultipleCreateTestJSON-976188141-project-member] [instance: 4bc7288c-3483-46ed-9c4f-673f86b10446] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1043.697294] env[68571]: DEBUG nova.compute.manager [None req-3ed62991-e7ab-4982-b992-98a17ee108de tempest-MultipleCreateTestJSON-976188141 tempest-MultipleCreateTestJSON-976188141-project-member] [instance: 4bc7288c-3483-46ed-9c4f-673f86b10446] Instance disappeared before build. 
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1043.718459] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3ed62991-e7ab-4982-b992-98a17ee108de tempest-MultipleCreateTestJSON-976188141 tempest-MultipleCreateTestJSON-976188141-project-member] Lock "4bc7288c-3483-46ed-9c4f-673f86b10446" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 227.407s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1043.727354] env[68571]: DEBUG nova.compute.manager [None req-ecc40a7d-e38c-4d8a-b590-07d74b57e720 tempest-ServersTestBootFromVolume-614723402 tempest-ServersTestBootFromVolume-614723402-project-member] [instance: d910cc12-8da2-4ce6-9107-a54c870405de] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1043.750538] env[68571]: DEBUG nova.compute.manager [None req-ecc40a7d-e38c-4d8a-b590-07d74b57e720 tempest-ServersTestBootFromVolume-614723402 tempest-ServersTestBootFromVolume-614723402-project-member] [instance: d910cc12-8da2-4ce6-9107-a54c870405de] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1043.771287] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ecc40a7d-e38c-4d8a-b590-07d74b57e720 tempest-ServersTestBootFromVolume-614723402 tempest-ServersTestBootFromVolume-614723402-project-member] Lock "d910cc12-8da2-4ce6-9107-a54c870405de" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 226.392s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1043.780386] env[68571]: DEBUG nova.compute.manager [None req-b3e1e414-f2fb-4ac5-b66d-778a4434410c tempest-ServerActionsTestOtherB-126684617 tempest-ServerActionsTestOtherB-126684617-project-member] [instance: d5416006-57fd-4966-90d2-3ba18d3eceba] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1043.806453] env[68571]: DEBUG nova.compute.manager [None req-b3e1e414-f2fb-4ac5-b66d-778a4434410c tempest-ServerActionsTestOtherB-126684617 tempest-ServerActionsTestOtherB-126684617-project-member] [instance: d5416006-57fd-4966-90d2-3ba18d3eceba] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1043.832908] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b3e1e414-f2fb-4ac5-b66d-778a4434410c tempest-ServerActionsTestOtherB-126684617 tempest-ServerActionsTestOtherB-126684617-project-member] Lock "d5416006-57fd-4966-90d2-3ba18d3eceba" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 221.344s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1043.842595] env[68571]: DEBUG nova.compute.manager [None req-766f1b84-25ef-4273-8acd-5c0cd23d89bb tempest-ServersTestManualDisk-226199740 tempest-ServersTestManualDisk-226199740-project-member] [instance: 812bbf03-e2c0-4827-8ed9-cc60611a77ca] Starting instance... 
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1043.870287] env[68571]: DEBUG nova.compute.manager [None req-766f1b84-25ef-4273-8acd-5c0cd23d89bb tempest-ServersTestManualDisk-226199740 tempest-ServersTestManualDisk-226199740-project-member] [instance: 812bbf03-e2c0-4827-8ed9-cc60611a77ca] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1043.890272] env[68571]: DEBUG oslo_concurrency.lockutils [None req-766f1b84-25ef-4273-8acd-5c0cd23d89bb tempest-ServersTestManualDisk-226199740 tempest-ServersTestManualDisk-226199740-project-member] Lock "812bbf03-e2c0-4827-8ed9-cc60611a77ca" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 217.327s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1043.899546] env[68571]: DEBUG nova.compute.manager [None req-6a1fc207-4e8b-4e1e-b831-20b107b2b88f tempest-ServerPasswordTestJSON-1255952750 tempest-ServerPasswordTestJSON-1255952750-project-member] [instance: 4c70fd0e-9872-423e-8b7e-4c17760d88bc] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1043.921954] env[68571]: DEBUG nova.compute.manager [None req-6a1fc207-4e8b-4e1e-b831-20b107b2b88f tempest-ServerPasswordTestJSON-1255952750 tempest-ServerPasswordTestJSON-1255952750-project-member] [instance: 4c70fd0e-9872-423e-8b7e-4c17760d88bc] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1043.943386] env[68571]: DEBUG oslo_concurrency.lockutils [None req-6a1fc207-4e8b-4e1e-b831-20b107b2b88f tempest-ServerPasswordTestJSON-1255952750 tempest-ServerPasswordTestJSON-1255952750-project-member] Lock "4c70fd0e-9872-423e-8b7e-4c17760d88bc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 212.876s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1043.953467] env[68571]: DEBUG nova.compute.manager [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Starting instance... 
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1043.999799] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1044.000046] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1044.001432] env[68571]: INFO nova.compute.claims [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1044.315526] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbbbc5ed-bcd9-483f-ae19-bbd6e410bc30 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1044.323269] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfc5987a-1330-4b2c-bc82-61413b985a70 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1044.352256] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-563bca33-15d9-4490-84f1-a072949e957d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1044.359297] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4571876-d628-4192-b67d-657bb8a15374 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1044.372066] env[68571]: DEBUG nova.compute.provider_tree [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1044.381050] env[68571]: DEBUG nova.scheduler.client.report [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1044.398346] env[68571]: DEBUG 
oslo_concurrency.lockutils [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.398s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1044.398852] env[68571]: DEBUG nova.compute.manager [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1044.437563] env[68571]: DEBUG nova.compute.utils [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1044.439030] env[68571]: DEBUG nova.compute.manager [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1044.439030] env[68571]: DEBUG nova.network.neutron [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1044.452379] env[68571]: DEBUG nova.compute.manager [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1044.520215] env[68571]: DEBUG nova.compute.manager [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1044.527094] env[68571]: DEBUG nova.policy [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd07cf307b20444739ac3b45f645d5a06', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '93bc72f3b9714240946b1295a142f5ee', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 1044.548869] env[68571]: DEBUG nova.virt.hardware [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1044.549179] env[68571]: DEBUG nova.virt.hardware [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1044.549179] env[68571]: DEBUG nova.virt.hardware [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1044.549266] env[68571]: DEBUG nova.virt.hardware [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1044.550922] env[68571]: DEBUG nova.virt.hardware [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1044.550922] env[68571]: DEBUG nova.virt.hardware [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints 
/opt/stack/nova/nova/virt/hardware.py:430}} [ 1044.550922] env[68571]: DEBUG nova.virt.hardware [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1044.550922] env[68571]: DEBUG nova.virt.hardware [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1044.550922] env[68571]: DEBUG nova.virt.hardware [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1044.551256] env[68571]: DEBUG nova.virt.hardware [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1044.551256] env[68571]: DEBUG nova.virt.hardware [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1044.551323] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65bb29a3-23b2-4cf9-83f6-861ad1f9c52f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1044.560359] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a659407d-ab72-43c8-bb12-2bd78ea47c61 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1044.856751] env[68571]: DEBUG nova.network.neutron [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Successfully created port: 07d2c0d2-115a-40d0-871f-b637bf2344ae {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1045.410110] env[68571]: DEBUG nova.network.neutron [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Successfully updated port: 07d2c0d2-115a-40d0-871f-b637bf2344ae {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1045.428027] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Acquiring lock "refresh_cache-87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5" {{(pid=68571) lock 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1045.429805] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Acquired lock "refresh_cache-87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1045.429805] env[68571]: DEBUG nova.network.neutron [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1045.470536] env[68571]: DEBUG nova.network.neutron [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1045.571999] env[68571]: DEBUG nova.compute.manager [req-bbec855c-486a-43ff-a60e-877a68aad147 req-acb1fb67-aaff-4824-a003-0bef30cb22d7 service nova] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Received event network-vif-plugged-07d2c0d2-115a-40d0-871f-b637bf2344ae {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1045.572354] env[68571]: DEBUG oslo_concurrency.lockutils [req-bbec855c-486a-43ff-a60e-877a68aad147 req-acb1fb67-aaff-4824-a003-0bef30cb22d7 service nova] Acquiring lock "87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1045.572694] env[68571]: DEBUG oslo_concurrency.lockutils [req-bbec855c-486a-43ff-a60e-877a68aad147 req-acb1fb67-aaff-4824-a003-0bef30cb22d7 service nova] Lock "87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1045.572805] env[68571]: DEBUG oslo_concurrency.lockutils [req-bbec855c-486a-43ff-a60e-877a68aad147 req-acb1fb67-aaff-4824-a003-0bef30cb22d7 service nova] Lock "87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1045.572975] env[68571]: DEBUG nova.compute.manager [req-bbec855c-486a-43ff-a60e-877a68aad147 req-acb1fb67-aaff-4824-a003-0bef30cb22d7 service nova] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] No waiting events found dispatching network-vif-plugged-07d2c0d2-115a-40d0-871f-b637bf2344ae {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1045.573330] env[68571]: WARNING nova.compute.manager [req-bbec855c-486a-43ff-a60e-877a68aad147 req-acb1fb67-aaff-4824-a003-0bef30cb22d7 service nova] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Received unexpected event network-vif-plugged-07d2c0d2-115a-40d0-871f-b637bf2344ae for instance with vm_state building and task_state spawning. 
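The inventory dict repeated by the report client above fixes this host's schedulable capacity: placement's documented rule is capacity = (total - reserved) * allocation_ratio. Plain arithmetic on the logged values, no Nova code involved:

    # Capacity implied by the logged inventory data.
    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
    }
    for rc, inv in inventory.items():
        cap = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, cap)   # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0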
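The network-vif-plugged warning just above is benign bookkeeping: Neutron delivered the event before the spawn path registered a waiter for it. An illustrative sketch of that pop-or-warn pattern (not Nova's actual code), using a per-instance event table guarded by a lock in the spirit of the "<uuid>-events" locks in these records:

    # Illustrative sketch of the waiting-event bookkeeping above.
    import threading

    class InstanceEvents:
        def __init__(self):
            self._events = {}      # instance_uuid -> {event_name: Event}
            self._lock = threading.Lock()

        def prepare(self, instance_uuid, event_name):
            # Registered by the spawn path before it starts waiting.
            with self._lock:
                per_instance = self._events.setdefault(instance_uuid, {})
                per_instance[event_name] = threading.Event()

        def pop(self, instance_uuid, event_name):
            # Called when an external event arrives (pop_instance_event).
            with self._lock:
                return self._events.get(instance_uuid, {}).pop(event_name, None)

    def handle_external_event(events, uuid, name):
        waiter = events.pop(uuid, name)
        if waiter is None:
            # Matches "No waiting events found dispatching ..." followed by
            # the "Received unexpected event ..." warning in the log.
            print(f"Received unexpected event {name} for instance {uuid}")
        else:
            waiter.set()           # wakes the thread blocked in spawn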
[ 1045.573510] env[68571]: DEBUG nova.compute.manager [req-bbec855c-486a-43ff-a60e-877a68aad147 req-acb1fb67-aaff-4824-a003-0bef30cb22d7 service nova] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Received event network-changed-07d2c0d2-115a-40d0-871f-b637bf2344ae {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1045.573742] env[68571]: DEBUG nova.compute.manager [req-bbec855c-486a-43ff-a60e-877a68aad147 req-acb1fb67-aaff-4824-a003-0bef30cb22d7 service nova] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Refreshing instance network info cache due to event network-changed-07d2c0d2-115a-40d0-871f-b637bf2344ae. {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1045.573822] env[68571]: DEBUG oslo_concurrency.lockutils [req-bbec855c-486a-43ff-a60e-877a68aad147 req-acb1fb67-aaff-4824-a003-0bef30cb22d7 service nova] Acquiring lock "refresh_cache-87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1045.639034] env[68571]: DEBUG nova.network.neutron [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Updating instance_info_cache with network_info: [{"id": "07d2c0d2-115a-40d0-871f-b637bf2344ae", "address": "fa:16:3e:ec:e9:fb", "network": {"id": "802e91c0-b497-4996-a9a8-0fb2969a1fd5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "129da41d4b1a4202be57f86562f628cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap07d2c0d2-11", "ovs_interfaceid": "07d2c0d2-115a-40d0-871f-b637bf2344ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1045.650366] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Releasing lock "refresh_cache-87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1045.650666] env[68571]: DEBUG nova.compute.manager [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Instance network_info: |[{"id": "07d2c0d2-115a-40d0-871f-b637bf2344ae", "address": "fa:16:3e:ec:e9:fb", "network": {"id": "802e91c0-b497-4996-a9a8-0fb2969a1fd5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": 
"192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "129da41d4b1a4202be57f86562f628cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap07d2c0d2-11", "ovs_interfaceid": "07d2c0d2-115a-40d0-871f-b637bf2344ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1045.650965] env[68571]: DEBUG oslo_concurrency.lockutils [req-bbec855c-486a-43ff-a60e-877a68aad147 req-acb1fb67-aaff-4824-a003-0bef30cb22d7 service nova] Acquired lock "refresh_cache-87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1045.651155] env[68571]: DEBUG nova.network.neutron [req-bbec855c-486a-43ff-a60e-877a68aad147 req-acb1fb67-aaff-4824-a003-0bef30cb22d7 service nova] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Refreshing network info cache for port 07d2c0d2-115a-40d0-871f-b637bf2344ae {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1045.652556] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ec:e9:fb', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f3d7e184-c87f-47a5-8d0d-9fa20e07e669', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '07d2c0d2-115a-40d0-871f-b637bf2344ae', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1045.662513] env[68571]: DEBUG oslo.service.loopingcall [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1045.665333] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1045.665789] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6cb8b111-caa7-443c-8d9f-e35ed9cafa3d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1045.686956] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1045.686956] env[68571]: value = "task-3467665" [ 1045.686956] env[68571]: _type = "Task" [ 1045.686956] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1045.695048] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467665, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1046.109213] env[68571]: DEBUG nova.network.neutron [req-bbec855c-486a-43ff-a60e-877a68aad147 req-acb1fb67-aaff-4824-a003-0bef30cb22d7 service nova] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Updated VIF entry in instance network info cache for port 07d2c0d2-115a-40d0-871f-b637bf2344ae. {{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1046.109806] env[68571]: DEBUG nova.network.neutron [req-bbec855c-486a-43ff-a60e-877a68aad147 req-acb1fb67-aaff-4824-a003-0bef30cb22d7 service nova] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Updating instance_info_cache with network_info: [{"id": "07d2c0d2-115a-40d0-871f-b637bf2344ae", "address": "fa:16:3e:ec:e9:fb", "network": {"id": "802e91c0-b497-4996-a9a8-0fb2969a1fd5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "129da41d4b1a4202be57f86562f628cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap07d2c0d2-11", "ovs_interfaceid": "07d2c0d2-115a-40d0-871f-b637bf2344ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1046.118928] env[68571]: DEBUG oslo_concurrency.lockutils [req-bbec855c-486a-43ff-a60e-877a68aad147 req-acb1fb67-aaff-4824-a003-0bef30cb22d7 service nova] Releasing lock "refresh_cache-87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1046.196519] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467665, 'name': CreateVM_Task, 'duration_secs': 0.295651} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1046.196678] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1046.197400] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1046.197567] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1046.197920] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1046.198185] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ed389818-4800-4696-8576-71d926f191c2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1046.202696] env[68571]: DEBUG oslo_vmware.api [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Waiting for the task: (returnval){ [ 1046.202696] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]5248a774-10ba-4ed5-5f5e-6d74663a1d64" [ 1046.202696] env[68571]: _type = "Task" [ 1046.202696] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1046.210105] env[68571]: DEBUG oslo_vmware.api [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]5248a774-10ba-4ed5-5f5e-6d74663a1d64, 'name': SearchDatastore_Task} progress is 0%. 
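Here the spawn path takes a named lock on the datastore image-cache entry ("[datastore1] devstack-image-cache_base/<image-id>") and then runs SearchDatastore_Task to check whether the cached VMDK already exists, so concurrent spawns of the same image serialize instead of fetching twice. The shape of that check-then-fetch pattern, sketched with in-process locks (the real path uses oslo.concurrency named locks plus the datastore search shown in the records, not a dict):

    import threading
    from collections import defaultdict

    _locks = defaultdict(threading.Lock)   # one lock per image id
    _cache = {}                            # image_id -> cached datastore path

    def get_cached_image(image_id, fetch_fn):
        # Serialize per-image so concurrent spawns don't download the same
        # image twice; only the first caller inside the lock pays the fetch.
        with _locks[image_id]:
            if image_id not in _cache:
                _cache[image_id] = fetch_fn(image_id)
            return _cache[image_id]

    path = get_cached_image(
        "6e7bf233-3ffe-4b3b-a510-62353d0292a6",
        lambda i: f"[datastore1] devstack-image-cache_base/{i}/{i}.vmdk")
    print(path)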
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1046.378035] env[68571]: DEBUG oslo_concurrency.lockutils [None req-64c860a2-0207-49e2-ac05-932c0fa66b26 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Acquiring lock "87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1046.713610] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1046.713918] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1046.714076] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1055.489177] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1056.483977] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1056.488569] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1056.488753] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1056.488911] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1056.503334] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" 
{{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1056.503612] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1056.503698] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1056.503830] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1056.504986] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-65aee18f-940c-48d8-93cc-abe655c44fc1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1056.514695] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9caf7844-5ab0-4375-a382-38988c6af46d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1056.530067] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a4c8822d-23c8-4968-99e4-88c91e3cd999 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1056.536777] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d71d44ed-11a0-47cd-b73b-e59c6f2b17b6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1056.567400] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180876MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1056.567559] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1056.567771] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1056.649054] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 
'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1056.649245] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1056.649376] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3adaf481-5844-45ac-8dc9-eb396a47ed1c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1056.649499] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance c962c9c7-04a4-46ec-a46f-fac13caa6a1e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1056.649620] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 25f17a16-f752-4927-a2a5-73f1f18e5c8c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1056.649739] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 244ba708-279e-440e-bc18-8c6ee7b83250 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1056.649943] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1056.650096] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance b60eb700-434f-4bea-a84f-9071402001c3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1056.650220] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 5e571ae2-9d45-402d-bce5-6e3721cc5374 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
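Each instance listed in this audit holds the same placement allocation, {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}, and the tracker's final view is the sum of those allocations on top of the reserved values. The arithmetic checks out against the totals reported a few records below (used_vcpus=10, used_disk=10GB, and used_ram=1792MB = 512MB reserved + 10 x 128MB):

    from collections import Counter

    # Ten actively managed instances, each with the allocation shown above.
    allocations = [{"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1}] * 10

    totals = Counter()
    for alloc in allocations:
        totals.update(alloc)

    reserved_ram_mb = 512  # the MEMORY_MB 'reserved' value reported below
    print(dict(totals))                           # {'DISK_GB': 10, 'MEMORY_MB': 1280, 'VCPU': 10}
    print(reserved_ram_mb + totals["MEMORY_MB"])  # 1792 -> matches used_ram=1792MB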
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1056.650347] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1056.661828] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance b90ac11a-50c6-4d12-a545-ccd92243e6ca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1056.671964] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance afe033a3-4e04-4249-beed-169a3e40a721 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1056.682044] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 14dee505-e30a-4395-9fe3-fb505492c4df has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1056.696903] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 73f10282-d15a-4d6b-a0b9-5b3cb8764ff9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1056.706898] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance e9b8ab85-e972-4081-ae38-602a92fe3ab9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1056.717760] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 061ea5d6-5470-4d7d-9ab1-ae5e606dd9cd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1056.731434] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 7a137e14-98ec-4718-8ff4-3700d2ef7ee9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1056.741898] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 9cace51b-100c-48d0-813c-eb31ec9384ec has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1056.752978] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance c8be0938-4b38-4e05-8afa-202d87a315b7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1056.764425] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f5328efa-b3e0-48b2-8f13-9715e46cb017 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1056.774024] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 2c21a8e5-da7f-4b3a-97ab-ec35f794edac has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1056.784126] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 5af733d9-dfa4-4059-8e33-1818695c8692 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1056.793539] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance a333a6c9-5119-4d2f-81f3-cb86795ed364 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1056.804382] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 1cbb0e1a-ca70-4e0e-9adb-c4b62e80818b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1056.814226] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance df5d4c12-01c8-46e2-b2a9-cf61a7d10e1a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1056.824552] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 466d2eae-c109-4286-a223-edca73d6c8fa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1056.824794] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1056.824942] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1057.217886] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-70be481d-8f61-4eb5-8869-462aa03bec31 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1057.225895] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b6dbd0ac-bf89-428a-872c-85474d1786f5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1057.257193] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4e9122d0-8dee-4fc4-b990-5326233de898 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1057.264316] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8ad5da2-cb5d-4860-975d-a5c17e5ad302 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1057.277272] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 
00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1057.287725] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1057.305838] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1057.306065] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.738s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1060.307581] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1060.489591] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1060.489787] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1060.489930] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1060.536517] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1060.536851] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1060.536851] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Skipping network cache update for instance because it is Building. 
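The inventory comparison above is a pure data diff: the driver-reported totals and configured ratios are assembled into the per-resource-class dict shown in the record, and the report client only writes to placement when that dict differs from the provider tree's cached copy. Under the usual placement formula, effective capacity is (total - reserved) * allocation_ratio, which is easy to sanity-check against the logged numbers:

    inventory = {  # as logged for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd
        "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0},
    }

    def capacity(inv):
        # Placement-style effective capacity; max_unit/step_size (omitted
        # here) constrain any single allocation, not the total.
        return (inv["total"] - inv["reserved"]) * inv["allocation_ratio"]

    for rc, inv in inventory.items():
        print(rc, capacity(inv))   # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0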
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1060.536932] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1060.537069] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1060.537194] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1060.537314] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1060.537430] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1060.537547] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1060.537661] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1060.537779] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1060.538328] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1060.539033] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... 
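The periodic-task records in this stretch all come from one runner: ComputeManager registers methods like _poll_rebooting_instances and _reclaim_queued_deletes, and oslo.service invokes each on its spacing. _reclaim_queued_deletes is additionally config-gated, which is the "CONF.reclaim_instance_interval <= 0, skipping..." line above. A compact sketch of that shape (the runner below is a plain loop; the real one tracks per-task spacing and isolates task errors):

    import threading

    RECLAIM_INSTANCE_INTERVAL = 0  # mirrors CONF.reclaim_instance_interval in this run

    def _reclaim_queued_deletes():
        # Config-gated body: with the interval unset there is nothing to do.
        if RECLAIM_INSTANCE_INTERVAL <= 0:
            print("CONF.reclaim_instance_interval <= 0, skipping...")
            return

    def run_periodic_tasks(tasks, spacing, stop):
        while not stop.is_set():
            for task in tasks:
                print("Running periodic task", task.__name__)
                task()
            stop.wait(spacing)

    stop = threading.Event()
    stop.set()                    # pre-stopped so this demo returns immediately
    run_periodic_tasks([_reclaim_queued_deletes], spacing=60.0, stop=stop)
    _reclaim_queued_deletes()     # prints the same skip message seen above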
{{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1061.417321] env[68571]: DEBUG oslo_concurrency.lockutils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Acquiring lock "e025f82d-a6a8-4dd4-b891-872f4b2fa176" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1061.417579] env[68571]: DEBUG oslo_concurrency.lockutils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Lock "e025f82d-a6a8-4dd4-b891-872f4b2fa176" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1061.442175] env[68571]: DEBUG oslo_concurrency.lockutils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Acquiring lock "a1253c3f-921b-4417-a8fb-22168474f9c1" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1061.442287] env[68571]: DEBUG oslo_concurrency.lockutils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Lock "a1253c3f-921b-4417-a8fb-22168474f9c1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1061.468020] env[68571]: DEBUG oslo_concurrency.lockutils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Acquiring lock "b5c24d31-97f5-4b9b-a08e-4006a1d5d316" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1061.468020] env[68571]: DEBUG oslo_concurrency.lockutils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Lock "b5c24d31-97f5-4b9b-a08e-4006a1d5d316" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1062.490284] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1091.535596] env[68571]: WARNING oslo_vmware.rw_handles [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Error occurred while reading the HTTP response.: 
http.client.RemoteDisconnected: Remote end closed connection without response [ 1091.535596] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1091.535596] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1091.535596] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1091.535596] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1091.535596] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 1091.535596] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1091.535596] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1091.535596] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1091.535596] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1091.535596] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1091.535596] env[68571]: ERROR oslo_vmware.rw_handles [ 1091.536480] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/c4e99682-09b0-463b-a58f-3c69c48b07cf/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1091.538030] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1091.538302] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Copying Virtual Disk [datastore1] vmware_temp/c4e99682-09b0-463b-a58f-3c69c48b07cf/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/c4e99682-09b0-463b-a58f-3c69c48b07cf/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1091.538604] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-5be15b20-3bb5-4b74-b6c3-ffb300e6271f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1091.546822] env[68571]: DEBUG oslo_vmware.api [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Waiting for the task: (returnval){ [ 1091.546822] env[68571]: value = "task-3467666" [ 1091.546822] env[68571]: _type = "Task" [ 1091.546822] env[68571]: } to complete. 
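The rw_handles traceback above comes from the close() of the image transfer: the handle tries to read the server's final HTTP response, the remote end has already dropped the connection, and the error is logged as a WARNING while the download is still treated as complete (the very next record reports the file landed in vmware_temp). A tolerant close in that spirit, assuming a plain http.client connection (illustrative, not oslo.vmware's actual code):

    import http.client
    import logging

    LOG = logging.getLogger(__name__)

    def close_transfer(conn: http.client.HTTPConnection) -> None:
        # Drain the server's response on close; a dropped connection is
        # downgraded to a warning instead of failing the finished transfer.
        try:
            conn.getresponse().read()
        except http.client.RemoteDisconnected:
            LOG.warning("Error occurred while reading the HTTP response.",
                        exc_info=True)
        finally:
            conn.close()

Whether that leniency is safe is exactly what this run goes on to probe: the subsequent CopyVirtualDisk_Task fails with InvalidArgument on fileType, which may indicate the "downloaded" VMDK was not actually intact.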
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1091.554676] env[68571]: DEBUG oslo_vmware.api [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Task: {'id': task-3467666, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1092.057807] env[68571]: DEBUG oslo_vmware.exceptions [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Fault InvalidArgument not matched. {{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1092.058095] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1092.058655] env[68571]: ERROR nova.compute.manager [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1092.058655] env[68571]: Faults: ['InvalidArgument'] [ 1092.058655] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Traceback (most recent call last): [ 1092.058655] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1092.058655] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] yield resources [ 1092.058655] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1092.058655] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] self.driver.spawn(context, instance, image_meta, [ 1092.058655] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1092.058655] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1092.058655] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1092.058655] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] self._fetch_image_if_missing(context, vi) [ 1092.058655] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1092.059112] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] image_cache(vi, tmp_image_ds_loc) [ 1092.059112] env[68571]: 
ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1092.059112] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] vm_util.copy_virtual_disk( [ 1092.059112] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1092.059112] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] session._wait_for_task(vmdk_copy_task) [ 1092.059112] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1092.059112] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] return self.wait_for_task(task_ref) [ 1092.059112] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1092.059112] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] return evt.wait() [ 1092.059112] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1092.059112] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] result = hub.switch() [ 1092.059112] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1092.059112] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] return self.greenlet.switch() [ 1092.059530] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1092.059530] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] self.f(*self.args, **self.kw) [ 1092.059530] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1092.059530] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] raise exceptions.translate_fault(task_info.error) [ 1092.059530] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1092.059530] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Faults: ['InvalidArgument'] [ 1092.059530] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] [ 1092.059530] env[68571]: INFO nova.compute.manager [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Terminating instance [ 1092.060529] env[68571]: DEBUG oslo_concurrency.lockutils [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 
tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1092.060738] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1092.060971] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7b903fcf-8df0-4f29-9657-93320f4ff308 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1092.063991] env[68571]: DEBUG nova.compute.manager [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1092.064201] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1092.064892] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c849c11-2458-4ecb-afa7-329ca6cb54ba {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1092.071480] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1092.071687] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2b4feb24-0669-4dd4-a421-24966ed04e5a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1092.073816] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1092.073982] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Folder [datastore1] devstack-image-cache_base created. 
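Note the interleaving here: while one request tears down instance ee103d2f after the failed spawn, another (req-39e69776) re-creates the image-cache directory, and the _create_folder_if_missing step named in the record treats an already-existing folder as success. A local-filesystem analogue of that mkdir-if-missing behavior:

    import errno
    import os
    import tempfile

    def mkdir_if_missing(path):
        # Treat an existing directory as success, since a concurrent creator
        # may have won the race; any other OSError still propagates.
        try:
            os.makedirs(path)
            print("Created directory with path", path)
        except OSError as exc:
            if exc.errno != errno.EEXIST:
                raise
            print("Folder", path, "already exists")

    base = tempfile.mkdtemp()
    cache = os.path.join(base, "devstack-image-cache_base")
    mkdir_if_missing(cache)   # creates it
    mkdir_if_missing(cache)   # second call is a no-op, as in the racy case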
{{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1092.074895] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c87cbe77-e82f-472a-9115-8e92cb9e811b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1092.080086] env[68571]: DEBUG oslo_vmware.api [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Waiting for the task: (returnval){ [ 1092.080086] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52ea3775-55cc-f9e5-858e-b0aece96a78e" [ 1092.080086] env[68571]: _type = "Task" [ 1092.080086] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1092.087192] env[68571]: DEBUG oslo_vmware.api [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52ea3775-55cc-f9e5-858e-b0aece96a78e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1092.136894] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1092.137125] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1092.137310] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Deleting the datastore file [datastore1] ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2 {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1092.137654] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-8f2c3415-769b-432e-8407-e79074c209c5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1092.143729] env[68571]: DEBUG oslo_vmware.api [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Waiting for the task: (returnval){ [ 1092.143729] env[68571]: value = "task-3467668" [ 1092.143729] env[68571]: _type = "Task" [ 1092.143729] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1092.151348] env[68571]: DEBUG oslo_vmware.api [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Task: {'id': task-3467668, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1092.590133] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1092.590389] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Creating directory with path [datastore1] vmware_temp/c26c4ce8-e1a2-4137-b991-aee21092884b/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1092.590625] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7a27a885-4739-42bc-a90b-85a63f546b9d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1092.602950] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Created directory with path [datastore1] vmware_temp/c26c4ce8-e1a2-4137-b991-aee21092884b/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1092.603194] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Fetch image to [datastore1] vmware_temp/c26c4ce8-e1a2-4137-b991-aee21092884b/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1092.603338] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/c26c4ce8-e1a2-4137-b991-aee21092884b/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1092.604084] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d70e65b1-dca5-4a81-8315-6e71f034f9c7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1092.610570] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-14b3421e-9820-4bb6-bbae-f8a699f24bb9 {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1092.619477] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e69c452-b445-4dec-942e-54d4987f6458 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1092.653869] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f39f9948-320b-4078-8dba-9d3a76d81737 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1092.661366] env[68571]: DEBUG oslo_vmware.api [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Task: {'id': task-3467668, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075377} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1092.663460] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1092.663460] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1092.663460] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1092.663634] env[68571]: INFO nova.compute.manager [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Took 0.60 seconds to destroy the instance on the hypervisor. 
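The DeleteDatastoreFile_Task entries above show the standard oslo.vmware invoke-and-wait pattern: the call returns a task object immediately, the task is then polled ("progress is 0%" ... "completed successfully"), and a task error is translated into an exception such as the VimFaultException ('InvalidArgument' on fileType) in the traceback at the top of this section. A minimal sketch of that pattern, assuming an already-established oslo_vmware.api.VMwareAPISession; the datacenter ref and datastore path are illustrative:

    # Sketch only: mirrors the invoke-and-wait flow these log entries show.
    # Assumes `session` is a connected oslo_vmware.api.VMwareAPISession.
    from oslo_vmware import exceptions as vexc

    def delete_datastore_file(session, dc_ref, ds_path):
        file_manager = session.vim.service_content.fileManager
        # Kicks off FileManager.DeleteDatastoreFile_Task on the vCenter side.
        task = session.invoke_api(session.vim, 'DeleteDatastoreFile_Task',
                                  file_manager, name=ds_path, datacenter=dc_ref)
        try:
            # wait_for_task polls task_info (the "progress is 0%" lines) and
            # raises a translated fault when task_info.error is set -- the
            # same path on which the fileType InvalidArgument fault surfaced
            # in the traceback above.
            session.wait_for_task(task)
        except vexc.FileNotFoundException:
            # An already-missing file is treated as a successful delete.
            pass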
[ 1092.665457] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-80642708-71d9-4bde-ad49-644bee4753df {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1092.667410] env[68571]: DEBUG nova.compute.claims [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1092.667586] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1092.667795] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1092.696429] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1092.749360] env[68571]: DEBUG oslo_vmware.rw_handles [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c26c4ce8-e1a2-4137-b991-aee21092884b/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1092.813414] env[68571]: DEBUG oslo_vmware.rw_handles [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1092.813630] env[68571]: DEBUG oslo_vmware.rw_handles [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c26c4ce8-e1a2-4137-b991-aee21092884b/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1093.116320] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3352499-c4c7-4043-8855-c35385f39e77 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1093.123601] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-884a68bd-4651-4ec3-970e-5d3dd85df885 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1093.153472] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edb42ca8-cd75-4743-a0e7-a0a02255ac9b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1093.160229] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-340ce1a7-6863-4347-9c76-2c6a193b31d8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1093.175235] env[68571]: DEBUG nova.compute.provider_tree [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1093.184292] env[68571]: DEBUG nova.scheduler.client.report [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1093.198676] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.531s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1093.199232] env[68571]: ERROR nova.compute.manager [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1093.199232] env[68571]: Faults: ['InvalidArgument'] [ 1093.199232] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Traceback (most recent call last): [ 1093.199232] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] File "/opt/stack/nova/nova/compute/manager.py", line 
2615, in _build_and_run_instance [ 1093.199232] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] self.driver.spawn(context, instance, image_meta, [ 1093.199232] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1093.199232] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1093.199232] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1093.199232] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] self._fetch_image_if_missing(context, vi) [ 1093.199232] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1093.199232] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] image_cache(vi, tmp_image_ds_loc) [ 1093.199232] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1093.199638] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] vm_util.copy_virtual_disk( [ 1093.199638] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1093.199638] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] session._wait_for_task(vmdk_copy_task) [ 1093.199638] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1093.199638] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] return self.wait_for_task(task_ref) [ 1093.199638] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1093.199638] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] return evt.wait() [ 1093.199638] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1093.199638] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] result = hub.switch() [ 1093.199638] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1093.199638] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] return self.greenlet.switch() [ 1093.199638] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1093.199638] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] self.f(*self.args, **self.kw) [ 1093.200034] env[68571]: ERROR nova.compute.manager [instance: 
ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1093.200034] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] raise exceptions.translate_fault(task_info.error) [ 1093.200034] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1093.200034] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Faults: ['InvalidArgument'] [ 1093.200034] env[68571]: ERROR nova.compute.manager [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] [ 1093.200034] env[68571]: DEBUG nova.compute.utils [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1093.201593] env[68571]: DEBUG nova.compute.manager [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Build of instance ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2 was re-scheduled: A specified parameter was not correct: fileType [ 1093.201593] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1093.201966] env[68571]: DEBUG nova.compute.manager [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1093.202160] env[68571]: DEBUG nova.compute.manager [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1093.202367] env[68571]: DEBUG nova.compute.manager [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1093.202536] env[68571]: DEBUG nova.network.neutron [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1093.509891] env[68571]: DEBUG nova.network.neutron [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1093.520579] env[68571]: INFO nova.compute.manager [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Took 0.32 seconds to deallocate network for instance. [ 1093.612642] env[68571]: INFO nova.scheduler.client.report [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Deleted allocations for instance ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2 [ 1093.632310] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bd53242f-7bd5-4fc8-98ee-3466f773312e tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Lock "ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 523.179s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1093.633468] env[68571]: DEBUG oslo_concurrency.lockutils [None req-87e44a4f-5976-4463-bba6-3ad40cef6ca9 tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Lock "ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 326.274s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1093.633697] env[68571]: DEBUG oslo_concurrency.lockutils [None req-87e44a4f-5976-4463-bba6-3ad40cef6ca9 tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Acquiring lock "ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1093.633904] env[68571]: DEBUG oslo_concurrency.lockutils [None req-87e44a4f-5976-4463-bba6-3ad40cef6ca9 tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member]
Lock "ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1093.634087] env[68571]: DEBUG oslo_concurrency.lockutils [None req-87e44a4f-5976-4463-bba6-3ad40cef6ca9 tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Lock "ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1093.636107] env[68571]: INFO nova.compute.manager [None req-87e44a4f-5976-4463-bba6-3ad40cef6ca9 tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Terminating instance [ 1093.637749] env[68571]: DEBUG nova.compute.manager [None req-87e44a4f-5976-4463-bba6-3ad40cef6ca9 tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1093.637969] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-87e44a4f-5976-4463-bba6-3ad40cef6ca9 tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1093.638469] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f2f06212-dda7-45f6-93d6-c1d681332c68 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1093.647816] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9c97e1e1-e70d-4bd6-949d-e2b5b68001de {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1093.660229] env[68571]: DEBUG nova.compute.manager [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1093.681677] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-87e44a4f-5976-4463-bba6-3ad40cef6ca9 tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2 could not be found.
[ 1093.681888] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-87e44a4f-5976-4463-bba6-3ad40cef6ca9 tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1093.682070] env[68571]: INFO nova.compute.manager [None req-87e44a4f-5976-4463-bba6-3ad40cef6ca9 tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1093.682339] env[68571]: DEBUG oslo.service.loopingcall [None req-87e44a4f-5976-4463-bba6-3ad40cef6ca9 tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1093.682563] env[68571]: DEBUG nova.compute.manager [-] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1093.682661] env[68571]: DEBUG nova.network.neutron [-] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1093.706905] env[68571]: DEBUG nova.network.neutron [-] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1093.716160] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1093.716248] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1093.718281] env[68571]: INFO nova.compute.claims [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1093.720939] env[68571]: INFO nova.compute.manager [-] [instance: ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2] Took 0.04 seconds to deallocate network for instance.
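The "Acquiring lock ... / acquired ... waited / released ... held" triplets in these entries come from oslo.concurrency's lock wrapper (the `inner` frames at lockutils.py:404/409/423). A minimal sketch of the same pattern, with an illustrative lock name and body:

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def instance_claim():
        # Runs with the named in-process lock held; lockutils logs the
        # acquire, the time spent waiting, and the hold time on release,
        # producing exactly the triplets seen in the entries above.
        ...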
[ 1093.813492] env[68571]: DEBUG oslo_concurrency.lockutils [None req-87e44a4f-5976-4463-bba6-3ad40cef6ca9 tempest-ImagesOneServerNegativeTestJSON-453802795 tempest-ImagesOneServerNegativeTestJSON-453802795-project-member] Lock "ee103d2f-8513-4e3a-af13-bfb3b5a7fbd2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.180s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1094.102459] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2afcd57d-94df-42f5-866e-f51601bbf5a8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1094.110383] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1989804-7f79-44db-a103-19fd9729ddcd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1094.139971] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b0496c9d-04af-49a3-ab3a-f250656a94fd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1094.147762] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9bc466f1-a61c-490d-b383-96acc8fb0d18 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1094.163012] env[68571]: DEBUG nova.compute.provider_tree [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1094.174300] env[68571]: DEBUG nova.scheduler.client.report [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1094.190914] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.475s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1094.191433] env[68571]: DEBUG nova.compute.manager [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Start building networks asynchronously for instance.
{{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1094.241017] env[68571]: DEBUG nova.compute.utils [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1094.242454] env[68571]: DEBUG nova.compute.manager [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1094.242658] env[68571]: DEBUG nova.network.neutron [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1094.250834] env[68571]: DEBUG nova.compute.manager [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1094.321130] env[68571]: DEBUG nova.policy [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd6341e1a4fba4ab895f34e82050aec27', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '30e2bcd11a624c55826d6a0216659958', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 1094.322859] env[68571]: DEBUG nova.compute.manager [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1094.350115] env[68571]: DEBUG nova.virt.hardware [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1094.350115] env[68571]: DEBUG nova.virt.hardware [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1094.350115] env[68571]: DEBUG nova.virt.hardware [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1094.350334] env[68571]: DEBUG nova.virt.hardware [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1094.350334] env[68571]: DEBUG nova.virt.hardware [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1094.350334] env[68571]: DEBUG nova.virt.hardware [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1094.350650] env[68571]: DEBUG nova.virt.hardware [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1094.350988] env[68571]: DEBUG nova.virt.hardware [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1094.351323] env[68571]: DEBUG 
nova.virt.hardware [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1094.351626] env[68571]: DEBUG nova.virt.hardware [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1094.351947] env[68571]: DEBUG nova.virt.hardware [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1094.353186] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8ff08514-f53f-4883-9a50-3c8b1dc2e921 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1094.361660] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad1d9015-8e68-4bd6-9ae1-0a0947443767 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1094.767465] env[68571]: DEBUG nova.network.neutron [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Successfully created port: d93a0bf7-26fe-435d-8f79-49fd68aba012 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1095.369566] env[68571]: DEBUG nova.network.neutron [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Successfully updated port: d93a0bf7-26fe-435d-8f79-49fd68aba012 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1095.384361] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Acquiring lock "refresh_cache-b90ac11a-50c6-4d12-a545-ccd92243e6ca" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1095.384361] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Acquired lock "refresh_cache-b90ac11a-50c6-4d12-a545-ccd92243e6ca" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1095.384361] env[68571]: DEBUG nova.network.neutron [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1095.420623] env[68571]: DEBUG nova.network.neutron [None 
req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1095.584753] env[68571]: DEBUG nova.network.neutron [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Updating instance_info_cache with network_info: [{"id": "d93a0bf7-26fe-435d-8f79-49fd68aba012", "address": "fa:16:3e:80:5d:82", "network": {"id": "61486ec9-eeea-4378-b3bf-6f58b7222c52", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-102410635-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "30e2bcd11a624c55826d6a0216659958", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6b8137fc-f23d-49b1-b19c-3123a5588f34", "external-id": "nsx-vlan-transportzone-709", "segmentation_id": 709, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd93a0bf7-26", "ovs_interfaceid": "d93a0bf7-26fe-435d-8f79-49fd68aba012", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1095.597847] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Releasing lock "refresh_cache-b90ac11a-50c6-4d12-a545-ccd92243e6ca" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1095.598218] env[68571]: DEBUG nova.compute.manager [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Instance network_info: |[{"id": "d93a0bf7-26fe-435d-8f79-49fd68aba012", "address": "fa:16:3e:80:5d:82", "network": {"id": "61486ec9-eeea-4378-b3bf-6f58b7222c52", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-102410635-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "30e2bcd11a624c55826d6a0216659958", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6b8137fc-f23d-49b1-b19c-3123a5588f34", "external-id": "nsx-vlan-transportzone-709", "segmentation_id": 709, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd93a0bf7-26", "ovs_interfaceid": 
"d93a0bf7-26fe-435d-8f79-49fd68aba012", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1095.598672] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:80:5d:82', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '6b8137fc-f23d-49b1-b19c-3123a5588f34', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd93a0bf7-26fe-435d-8f79-49fd68aba012', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1095.606766] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Creating folder: Project (30e2bcd11a624c55826d6a0216659958). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1095.607352] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-153c3f93-006c-4309-a1f6-d671442e99a5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1095.619270] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Created folder: Project (30e2bcd11a624c55826d6a0216659958) in parent group-v692787. [ 1095.619481] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Creating folder: Instances. Parent ref: group-v692852. 
{{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1095.619752] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-8d8fcac4-d7b5-44d9-abe4-6442af583b92 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1095.625659] env[68571]: DEBUG nova.compute.manager [req-54f70bca-df17-40d5-896f-7238ae976d71 req-14982d20-81c0-49aa-b965-1e60969907e2 service nova] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Received event network-vif-plugged-d93a0bf7-26fe-435d-8f79-49fd68aba012 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1095.625861] env[68571]: DEBUG oslo_concurrency.lockutils [req-54f70bca-df17-40d5-896f-7238ae976d71 req-14982d20-81c0-49aa-b965-1e60969907e2 service nova] Acquiring lock "b90ac11a-50c6-4d12-a545-ccd92243e6ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1095.626077] env[68571]: DEBUG oslo_concurrency.lockutils [req-54f70bca-df17-40d5-896f-7238ae976d71 req-14982d20-81c0-49aa-b965-1e60969907e2 service nova] Lock "b90ac11a-50c6-4d12-a545-ccd92243e6ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1095.626269] env[68571]: DEBUG oslo_concurrency.lockutils [req-54f70bca-df17-40d5-896f-7238ae976d71 req-14982d20-81c0-49aa-b965-1e60969907e2 service nova] Lock "b90ac11a-50c6-4d12-a545-ccd92243e6ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1095.626462] env[68571]: DEBUG nova.compute.manager [req-54f70bca-df17-40d5-896f-7238ae976d71 req-14982d20-81c0-49aa-b965-1e60969907e2 service nova] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] No waiting events found dispatching network-vif-plugged-d93a0bf7-26fe-435d-8f79-49fd68aba012 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1095.626620] env[68571]: WARNING nova.compute.manager [req-54f70bca-df17-40d5-896f-7238ae976d71 req-14982d20-81c0-49aa-b965-1e60969907e2 service nova] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Received unexpected event network-vif-plugged-d93a0bf7-26fe-435d-8f79-49fd68aba012 for instance with vm_state building and task_state spawning. [ 1095.626784] env[68571]: DEBUG nova.compute.manager [req-54f70bca-df17-40d5-896f-7238ae976d71 req-14982d20-81c0-49aa-b965-1e60969907e2 service nova] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Received event network-changed-d93a0bf7-26fe-435d-8f79-49fd68aba012 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1095.626980] env[68571]: DEBUG nova.compute.manager [req-54f70bca-df17-40d5-896f-7238ae976d71 req-14982d20-81c0-49aa-b965-1e60969907e2 service nova] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Refreshing instance network info cache due to event network-changed-d93a0bf7-26fe-435d-8f79-49fd68aba012.
{{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1095.627144] env[68571]: DEBUG oslo_concurrency.lockutils [req-54f70bca-df17-40d5-896f-7238ae976d71 req-14982d20-81c0-49aa-b965-1e60969907e2 service nova] Acquiring lock "refresh_cache-b90ac11a-50c6-4d12-a545-ccd92243e6ca" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1095.627295] env[68571]: DEBUG oslo_concurrency.lockutils [req-54f70bca-df17-40d5-896f-7238ae976d71 req-14982d20-81c0-49aa-b965-1e60969907e2 service nova] Acquired lock "refresh_cache-b90ac11a-50c6-4d12-a545-ccd92243e6ca" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1095.627442] env[68571]: DEBUG nova.network.neutron [req-54f70bca-df17-40d5-896f-7238ae976d71 req-14982d20-81c0-49aa-b965-1e60969907e2 service nova] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Refreshing network info cache for port d93a0bf7-26fe-435d-8f79-49fd68aba012 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1095.629755] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Created folder: Instances in parent group-v692852. [ 1095.629980] env[68571]: DEBUG oslo.service.loopingcall [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1095.630348] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1095.630550] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0e556a27-ce69-4e0b-8ce2-bfb652329414 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1095.652387] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1095.652387] env[68571]: value = "task-3467671" [ 1095.652387] env[68571]: _type = "Task" [ 1095.652387] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1095.659788] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467671, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1095.897191] env[68571]: DEBUG nova.network.neutron [req-54f70bca-df17-40d5-896f-7238ae976d71 req-14982d20-81c0-49aa-b965-1e60969907e2 service nova] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Updated VIF entry in instance network info cache for port d93a0bf7-26fe-435d-8f79-49fd68aba012. 
{{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1095.897687] env[68571]: DEBUG nova.network.neutron [req-54f70bca-df17-40d5-896f-7238ae976d71 req-14982d20-81c0-49aa-b965-1e60969907e2 service nova] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Updating instance_info_cache with network_info: [{"id": "d93a0bf7-26fe-435d-8f79-49fd68aba012", "address": "fa:16:3e:80:5d:82", "network": {"id": "61486ec9-eeea-4378-b3bf-6f58b7222c52", "bridge": "br-int", "label": "tempest-ServersNegativeTestJSON-102410635-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "30e2bcd11a624c55826d6a0216659958", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "6b8137fc-f23d-49b1-b19c-3123a5588f34", "external-id": "nsx-vlan-transportzone-709", "segmentation_id": 709, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd93a0bf7-26", "ovs_interfaceid": "d93a0bf7-26fe-435d-8f79-49fd68aba012", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1095.909048] env[68571]: DEBUG oslo_concurrency.lockutils [req-54f70bca-df17-40d5-896f-7238ae976d71 req-14982d20-81c0-49aa-b965-1e60969907e2 service nova] Releasing lock "refresh_cache-b90ac11a-50c6-4d12-a545-ccd92243e6ca" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1096.167022] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467671, 'name': CreateVM_Task, 'duration_secs': 0.275124} completed successfully. 
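The instance_info_cache payload above is a JSON list of VIF dicts. When reading these entries by hand, a small projection helper makes the interesting fields visible at a glance; this is purely an illustrative reading aid, not a nova API (the values below are copied from the entry above):

def summarize_vif(vif):
    fixed = [ip["address"]
             for subnet in vif["network"]["subnets"]
             for ip in subnet["ips"] if ip["type"] == "fixed"]
    return {"port_id": vif["id"], "mac": vif["address"],
            "fixed_ips": fixed, "mtu": vif["network"]["meta"].get("mtu"),
            "vif_type": vif["type"]}

vif = {"id": "d93a0bf7-26fe-435d-8f79-49fd68aba012",
       "address": "fa:16:3e:80:5d:82", "type": "ovs",
       "network": {"meta": {"mtu": 8950},
                   "subnets": [{"ips": [{"address": "192.168.128.3",
                                         "type": "fixed"}]}]}}
print(summarize_vif(vif))  # port id, MAC, ['192.168.128.3'], 8950, 'ovs'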
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1096.167022] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1096.167022] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1096.167022] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1096.167022] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1096.167283] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d878213f-976e-4ed2-9d1d-388425b0fd1f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1096.173832] env[68571]: DEBUG oslo_vmware.api [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Waiting for the task: (returnval){ [ 1096.173832] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52e144e6-206c-4654-bb35-f077b5afeb53" [ 1096.173832] env[68571]: _type = "Task" [ 1096.173832] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1096.181620] env[68571]: DEBUG oslo_vmware.api [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52e144e6-206c-4654-bb35-f077b5afeb53, 'name': SearchDatastore_Task} progress is 0%. 
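The Acquiring/Acquired lines above serialize access to the image cache with a lock named after the datastore cache path, so concurrent spawns of the same image cannot probe or populate it at the same time. oslo.concurrency layers fairness, external file locks, and the logged semaphore on top of this; the core idea is just a keyed lock, e.g.:

import threading
from contextlib import contextmanager

_locks = {}
_registry_guard = threading.Lock()

@contextmanager
def named_lock(name):
    # one lock object per name, created on first use
    with _registry_guard:
        lock = _locks.setdefault(name, threading.Lock())
    with lock:
        yield

cache_key = "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6"
with named_lock(cache_key):
    pass  # probe the datastore / populate the cached image here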
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1096.683857] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1096.684125] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1096.684341] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1111.134428] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquiring lock "a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1111.134740] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1113.489795] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1113.490045] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Cleaning up deleted instances with incomplete migration {{(pid=68571) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1114.460489] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._sync_power_states {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1114.482557] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Getting list of instances from cluster (obj){ [ 1114.482557] env[68571]: value = "domain-c8" [ 1114.482557] env[68571]: _type = "ClusterComputeResource" [ 1114.482557] env[68571]: } {{(pid=68571) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 
1114.483918] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f39e51cd-15f4-4fe6-a800-88a0a8321509 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1114.501401] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Got total of 10 instances {{(pid=68571) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1114.501701] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Triggering sync for uuid 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2 {{(pid=68571) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1114.501765] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Triggering sync for uuid 3adaf481-5844-45ac-8dc9-eb396a47ed1c {{(pid=68571) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1114.501920] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Triggering sync for uuid c962c9c7-04a4-46ec-a46f-fac13caa6a1e {{(pid=68571) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1114.502085] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Triggering sync for uuid 25f17a16-f752-4927-a2a5-73f1f18e5c8c {{(pid=68571) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1114.502241] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Triggering sync for uuid 244ba708-279e-440e-bc18-8c6ee7b83250 {{(pid=68571) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1114.502386] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Triggering sync for uuid 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6 {{(pid=68571) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1114.503015] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Triggering sync for uuid b60eb700-434f-4bea-a84f-9071402001c3 {{(pid=68571) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1114.503015] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Triggering sync for uuid 5e571ae2-9d45-402d-bce5-6e3721cc5374 {{(pid=68571) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1114.503015] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Triggering sync for uuid 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5 {{(pid=68571) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1114.503015] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Triggering sync for uuid b90ac11a-50c6-4d12-a545-ccd92243e6ca {{(pid=68571) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1114.503342] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "4a43ba00-1df6-4f10-a4ce-37c4ae353cc2" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1114.503562] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring 
lock "3adaf481-5844-45ac-8dc9-eb396a47ed1c" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1114.504421] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "c962c9c7-04a4-46ec-a46f-fac13caa6a1e" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1114.504421] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "25f17a16-f752-4927-a2a5-73f1f18e5c8c" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1114.504421] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "244ba708-279e-440e-bc18-8c6ee7b83250" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1114.504421] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1114.504636] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "b60eb700-434f-4bea-a84f-9071402001c3" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1114.504727] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "5e571ae2-9d45-402d-bce5-6e3721cc5374" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1114.504921] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1114.505128] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "b90ac11a-50c6-4d12-a545-ccd92243e6ca" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1116.489885] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1116.490232] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1116.501347] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1116.501565] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1116.501730] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1116.502287] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1116.503028] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-18fb0e20-1db3-411b-9381-d0cd7c512a34 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1116.511950] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47441226-c883-47d5-9661-b76450c34855 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1116.526125] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-39ed7cef-09e9-4b6b-a3b1-d45c8dab2aff {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1116.532645] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32e88036-911c-4208-8b62-879ec722851a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1116.562363] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180930MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1116.562540] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1116.562745] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1116.720937] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1116.720937] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3adaf481-5844-45ac-8dc9-eb396a47ed1c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1116.720937] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance c962c9c7-04a4-46ec-a46f-fac13caa6a1e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1116.720937] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 25f17a16-f752-4927-a2a5-73f1f18e5c8c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1116.721139] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 244ba708-279e-440e-bc18-8c6ee7b83250 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1116.721139] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1116.721139] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance b60eb700-434f-4bea-a84f-9071402001c3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1116.721139] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 5e571ae2-9d45-402d-bce5-6e3721cc5374 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1116.721272] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1116.721272] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance b90ac11a-50c6-4d12-a545-ccd92243e6ca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1116.731674] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance afe033a3-4e04-4249-beed-169a3e40a721 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1116.742395] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 14dee505-e30a-4395-9fe3-fb505492c4df has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1116.752950] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 73f10282-d15a-4d6b-a0b9-5b3cb8764ff9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1116.763167] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance e9b8ab85-e972-4081-ae38-602a92fe3ab9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1116.772264] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 061ea5d6-5470-4d7d-9ab1-ae5e606dd9cd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1116.780905] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 7a137e14-98ec-4718-8ff4-3700d2ef7ee9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1116.789468] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 9cace51b-100c-48d0-813c-eb31ec9384ec has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1116.797995] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance c8be0938-4b38-4e05-8afa-202d87a315b7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1116.806449] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f5328efa-b3e0-48b2-8f13-9715e46cb017 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1116.814837] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 2c21a8e5-da7f-4b3a-97ab-ec35f794edac has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1116.823546] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 5af733d9-dfa4-4059-8e33-1818695c8692 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1116.832099] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance a333a6c9-5119-4d2f-81f3-cb86795ed364 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1116.840724] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 1cbb0e1a-ca70-4e0e-9adb-c4b62e80818b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1116.850152] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance df5d4c12-01c8-46e2-b2a9-cf61a7d10e1a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1116.858773] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 466d2eae-c109-4286-a223-edca73d6c8fa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1116.866947] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance e025f82d-a6a8-4dd4-b891-872f4b2fa176 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1116.875028] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance a1253c3f-921b-4417-a8fb-22168474f9c1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1116.882975] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance b5c24d31-97f5-4b9b-a08e-4006a1d5d316 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1116.891198] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
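The long run of messages above is the resource tracker reconciling placement allocations against local knowledge: allocations consumed by instances actively managed here are kept and feed the usage totals reported just below, allocations for instances scheduled here but not yet started are left untouched ("skipping heal"), and anything else would be treated as stale. The arithmetic checks out: 512 MB reserved plus ten instances at 128 MB gives the 1792 MB used_ram that follows. A condensed sketch with illustrative (truncated) identifiers:

def triage(allocations, managed, scheduled):
    # allocations: {consumer_uuid: resources}; managed/scheduled: uuid sets
    kept = {}
    for uuid, res in allocations.items():
        if uuid in managed:
            kept[uuid] = res          # actively managed on this host
        elif uuid in scheduled:
            pass                      # scheduled but not started: skip heal
        else:
            print("stale allocation for", uuid)  # would be removed
    return kept

res = {"DISK_GB": 1, "MEMORY_MB": 128, "VCPU": 1}
managed = {f"uuid-{i}" for i in range(10)}   # the 10 running instances
scheduled = {"afe033a3", "14dee505"}         # building, per the log above
allocs = {u: dict(res) for u in managed | scheduled}
kept = triage(allocs, managed, scheduled)

reserved_ram_mb = 512
used_ram = reserved_ram_mb + sum(r["MEMORY_MB"] for r in kept.values())
used_disk = sum(r["DISK_GB"] for r in kept.values())
used_vcpus = sum(r["VCPU"] for r in kept.values())
assert (used_ram, used_disk, used_vcpus) == (1792, 10, 10)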
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1116.891421] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1116.891570] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1116.907621] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Refreshing inventories for resource provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1116.921540] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Updating ProviderTree inventory for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1116.921716] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Updating inventory in ProviderTree for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1116.931767] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Refreshing aggregate associations for resource provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd, aggregates: None {{(pid=68571) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1116.949532] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Refreshing trait associations for resource provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE {{(pid=68571) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1117.257605] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-167284f5-395d-48c5-8d41-cbe14e52b913 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1117.265641] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-b7ed4a9f-4cff-489b-a2e6-a87e7f4391d9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1117.295480] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c41b6c5-3cbd-4633-ab93-30e0213cc1b9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1117.302776] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb09d6d3-b39a-4b1e-a26d-ac7c2015cc71 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1117.316093] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1117.326013] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1117.340812] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1117.341008] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.778s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1118.335966] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1118.336353] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1118.489485] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1119.484190] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1121.489720] env[68571]: 
DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1122.489566] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1122.489753] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1122.489979] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1122.510295] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1122.513820] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1122.513820] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1122.513820] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1122.513820] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1122.513820] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1122.514124] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1122.514124] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Skipping network cache update for instance because it is Building. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1122.514124] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1122.514124] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1122.514124] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1122.514297] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1122.514297] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1122.514297] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1123.489599] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1123.489770] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Cleaning up deleted instances {{(pid=68571) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1123.501685] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] There are 0 instances to clean {{(pid=68571) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1123.501921] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1139.211779] env[68571]: WARNING oslo_vmware.rw_handles [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1139.211779] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1139.211779] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1139.211779] env[68571]: ERROR oslo_vmware.rw_handles 
self._conn.getresponse() [ 1139.211779] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1139.211779] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 1139.211779] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1139.211779] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1139.211779] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1139.211779] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1139.211779] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1139.211779] env[68571]: ERROR oslo_vmware.rw_handles [ 1139.212472] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/c26c4ce8-e1a2-4137-b991-aee21092884b/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1139.214212] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1139.214445] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Copying Virtual Disk [datastore1] vmware_temp/c26c4ce8-e1a2-4137-b991-aee21092884b/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/c26c4ce8-e1a2-4137-b991-aee21092884b/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1139.214733] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-0741b7bb-6d2b-483d-bd75-8ea52b2c15cd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1139.222873] env[68571]: DEBUG oslo_vmware.api [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Waiting for the task: (returnval){ [ 1139.222873] env[68571]: value = "task-3467672" [ 1139.222873] env[68571]: _type = "Task" [ 1139.222873] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1139.230095] env[68571]: DEBUG oslo_vmware.api [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Task: {'id': task-3467672, 'name': CopyVirtualDisk_Task} progress is 0%. 
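The sequence above is the image-cache populate path: the Glance download lands in a vmware_temp scratch file (tmp-sparse.vmdk), then CopyVirtualDisk_Task converts it into the cache location and is polled like any other task. The traceback that follows shows this same kind of copy failing with InvalidArgument: fileType, which the poller surfaces by translating the task's fault record into a raised exception. A condensed, illustrative sketch of the flow and the fault translation:

class VimFault(Exception):
    def __init__(self, msg, faults):
        super().__init__(msg)
        self.faults = faults

def poll_until_done(task):
    # translate the task's fault record into a raised exception,
    # as _poll_task does above on the error branch
    if task["state"] == "error":
        raise VimFault(task["message"], task["faults"])
    return task.get("result")

def cache_image(download, copy_disk, tmp_path, cache_path):
    download(tmp_path)                      # HTTP fetch to tmp-sparse.vmdk
    task = copy_disk(tmp_path, cache_path)  # CopyVirtualDisk_Task
    return poll_until_done(task)

# a failing copy, mirroring the fault in the traceback that follows
failing = {"state": "error",
           "message": "A specified parameter was not correct: fileType",
           "faults": ["InvalidArgument"]}
try:
    poll_until_done(failing)
except VimFault as e:
    print(e, e.faults)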
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1139.577207] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a82277c5-4713-4316-af14-7eb2399b45d5 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Acquiring lock "b90ac11a-50c6-4d12-a545-ccd92243e6ca" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1139.733978] env[68571]: DEBUG oslo_vmware.exceptions [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Fault InvalidArgument not matched. {{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1139.734320] env[68571]: DEBUG oslo_concurrency.lockutils [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1139.734857] env[68571]: ERROR nova.compute.manager [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1139.734857] env[68571]: Faults: ['InvalidArgument'] [ 1139.734857] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Traceback (most recent call last): [ 1139.734857] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1139.734857] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] yield resources [ 1139.734857] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1139.734857] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] self.driver.spawn(context, instance, image_meta, [ 1139.734857] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1139.734857] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1139.734857] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1139.734857] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] self._fetch_image_if_missing(context, vi) [ 1139.734857] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1139.735252] env[68571]: ERROR nova.compute.manager [instance: 
4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] image_cache(vi, tmp_image_ds_loc) [ 1139.735252] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1139.735252] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] vm_util.copy_virtual_disk( [ 1139.735252] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1139.735252] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] session._wait_for_task(vmdk_copy_task) [ 1139.735252] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1139.735252] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] return self.wait_for_task(task_ref) [ 1139.735252] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1139.735252] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] return evt.wait() [ 1139.735252] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1139.735252] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] result = hub.switch() [ 1139.735252] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1139.735252] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] return self.greenlet.switch() [ 1139.735603] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1139.735603] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] self.f(*self.args, **self.kw) [ 1139.735603] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1139.735603] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] raise exceptions.translate_fault(task_info.error) [ 1139.735603] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1139.735603] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Faults: ['InvalidArgument'] [ 1139.735603] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] [ 1139.735603] env[68571]: INFO nova.compute.manager [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Terminating instance [ 1139.736684] env[68571]: DEBUG oslo_concurrency.lockutils [None 
req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1139.736929] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1139.737210] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-cd2d7a2d-6e2b-4a04-91d9-2537fd1fd054 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1139.739422] env[68571]: DEBUG nova.compute.manager [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1139.739645] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1139.740401] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3d0dada-d6c4-475f-b0c6-43acb91b13c9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1139.747626] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1139.747835] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b485887a-c642-41d4-b876-cddfccbba892 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1139.750025] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1139.750205] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1139.751155] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c33e5077-beac-4866-9cf1-4df07da3e39d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1139.755920] env[68571]: DEBUG oslo_vmware.api [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Waiting for the task: (returnval){ [ 1139.755920] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52c5e3cf-9cf6-1eee-e80c-b0a62a654b1b" [ 1139.755920] env[68571]: _type = "Task" [ 1139.755920] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1139.767665] env[68571]: DEBUG oslo_vmware.api [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52c5e3cf-9cf6-1eee-e80c-b0a62a654b1b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1139.818895] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1139.819145] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1139.819330] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Deleting the datastore file [datastore1] 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2 {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1139.819603] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-50675028-d920-4db1-9753-a0ac3362e2f8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1139.828156] env[68571]: DEBUG oslo_vmware.api [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Waiting for the task: (returnval){ [ 1139.828156] env[68571]: value = "task-3467674" [ 1139.828156] env[68571]: _type = "Task" [ 1139.828156] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1139.834387] env[68571]: DEBUG oslo_vmware.api [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Task: {'id': task-3467674, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1140.266018] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1140.266383] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Creating directory with path [datastore1] vmware_temp/dc43ff76-5964-47fb-93b7-0ce8372eb0f7/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1140.266517] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-edcfd76a-84d2-4702-8e0e-8c556efda3d4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1140.277294] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Created directory with path [datastore1] vmware_temp/dc43ff76-5964-47fb-93b7-0ce8372eb0f7/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1140.277477] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Fetch image to [datastore1] vmware_temp/dc43ff76-5964-47fb-93b7-0ce8372eb0f7/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1140.277645] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/dc43ff76-5964-47fb-93b7-0ce8372eb0f7/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1140.278357] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2b5b34f-4ad0-4365-b137-f459d51d5ffd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1140.284820] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c772b27e-32a5-4a3d-96ff-687bec158bbb {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1140.294659] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7d173b1-ae72-4938-a6f1-79597a1bec28 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1140.324321] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-888c9592-39f0-4ce4-b7b3-f216520922c5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1140.332209] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-03edcbef-fcf0-4f1e-af50-311ebcdd7c09 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1140.336339] env[68571]: DEBUG oslo_vmware.api [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Task: {'id': task-3467674, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072815} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1140.336821] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1140.337046] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1140.337235] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1140.337407] env[68571]: INFO nova.compute.manager [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Took 0.60 seconds to destroy the instance on the hypervisor. 
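The "Waiting for the task ... to complete", "progress is 0%", and "completed successfully. ... duration_secs" lines above are all one mechanism: oslo.vmware polling a vCenter TaskInfo object until it reaches a terminal state, and translating an error state into the VimFaultException seen earlier. The sketch below is an illustrative reconstruction of that loop, not the oslo.vmware source; get_task_info and poll_interval are stand-in names.

import time

class TaskFailed(Exception):
    """Stands in for oslo_vmware.exceptions.VimFaultException."""

def wait_for_task(get_task_info, poll_interval=0.5):
    """Poll a vSphere task until it reaches a terminal state.

    get_task_info() is assumed to return an object with the TaskInfo
    fields used here: .state ('queued' | 'running' | 'success' |
    'error'), .progress, .result and .error.
    """
    while True:
        info = get_task_info()
        if info.state == 'success':
            # The log records this as "completed successfully" with a
            # duration_secs value.
            return info.result
        if info.state == 'error':
            # This branch is what surfaced above as
            # "A specified parameter was not correct: fileType /
            #  Faults: ['InvalidArgument']".
            raise TaskFailed(info.error)
        # Non-terminal state: the log prints "progress is N%" on each pass.
        time.sleep(poll_interval)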
[ 1140.339460] env[68571]: DEBUG nova.compute.claims [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1140.339645] env[68571]: DEBUG oslo_concurrency.lockutils [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1140.339858] env[68571]: DEBUG oslo_concurrency.lockutils [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1140.354826] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1140.404924] env[68571]: DEBUG oslo_vmware.rw_handles [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/dc43ff76-5964-47fb-93b7-0ce8372eb0f7/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1140.465305] env[68571]: DEBUG oslo_vmware.rw_handles [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1140.465305] env[68571]: DEBUG oslo_vmware.rw_handles [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/dc43ff76-5964-47fb-93b7-0ce8372eb0f7/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1140.765567] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-060e3c5f-6e55-4f98-8f90-c2679d9456e5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1140.773331] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33a28e0e-8dff-4a42-b92c-c5b3889387cb {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1140.802928] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ef59d81-35fd-41e5-b7ff-ff4aab65cde8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1140.809956] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-982febd4-39d0-4abc-b4dd-c1a3972e4226 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1140.822588] env[68571]: DEBUG nova.compute.provider_tree [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1140.832827] env[68571]: DEBUG nova.scheduler.client.report [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1140.847748] env[68571]: DEBUG oslo_concurrency.lockutils [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.508s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1140.848251] env[68571]: ERROR nova.compute.manager [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1140.848251] env[68571]: Faults: ['InvalidArgument'] [ 1140.848251] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Traceback (most recent call last): [ 1140.848251] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] File 
"/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1140.848251] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] self.driver.spawn(context, instance, image_meta, [ 1140.848251] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1140.848251] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1140.848251] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1140.848251] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] self._fetch_image_if_missing(context, vi) [ 1140.848251] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1140.848251] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] image_cache(vi, tmp_image_ds_loc) [ 1140.848251] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1140.848788] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] vm_util.copy_virtual_disk( [ 1140.848788] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1140.848788] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] session._wait_for_task(vmdk_copy_task) [ 1140.848788] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1140.848788] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] return self.wait_for_task(task_ref) [ 1140.848788] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1140.848788] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] return evt.wait() [ 1140.848788] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1140.848788] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] result = hub.switch() [ 1140.848788] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1140.848788] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] return self.greenlet.switch() [ 1140.848788] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1140.848788] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] self.f(*self.args, **self.kw) [ 1140.849347] 
env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1140.849347] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] raise exceptions.translate_fault(task_info.error) [ 1140.849347] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1140.849347] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Faults: ['InvalidArgument'] [ 1140.849347] env[68571]: ERROR nova.compute.manager [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] [ 1140.849347] env[68571]: DEBUG nova.compute.utils [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1140.850414] env[68571]: DEBUG nova.compute.manager [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Build of instance 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2 was re-scheduled: A specified parameter was not correct: fileType [ 1140.850414] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1140.850794] env[68571]: DEBUG nova.compute.manager [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1140.850971] env[68571]: DEBUG nova.compute.manager [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1140.851163] env[68571]: DEBUG nova.compute.manager [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1140.851325] env[68571]: DEBUG nova.network.neutron [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1141.170103] env[68571]: DEBUG nova.network.neutron [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1141.185272] env[68571]: INFO nova.compute.manager [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Took 0.33 seconds to deallocate network for instance. [ 1141.302059] env[68571]: INFO nova.scheduler.client.report [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Deleted allocations for instance 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2 [ 1141.324222] env[68571]: DEBUG oslo_concurrency.lockutils [None req-39e69776-e19d-4431-b2ec-a541079fc4bf tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Lock "4a43ba00-1df6-4f10-a4ce-37c4ae353cc2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 569.784s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1141.325319] env[68571]: DEBUG oslo_concurrency.lockutils [None req-0fbaf758-ac4c-4d68-b4ff-90c926cf1009 tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Lock "4a43ba00-1df6-4f10-a4ce-37c4ae353cc2" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 372.056s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1141.325535] env[68571]: DEBUG oslo_concurrency.lockutils [None req-0fbaf758-ac4c-4d68-b4ff-90c926cf1009 tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Acquiring lock "4a43ba00-1df6-4f10-a4ce-37c4ae353cc2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1141.325735] env[68571]: DEBUG oslo_concurrency.lockutils [None req-0fbaf758-ac4c-4d68-b4ff-90c926cf1009 tempest-ServerAddressesNegativeTestJSON-2092165845 
tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Lock "4a43ba00-1df6-4f10-a4ce-37c4ae353cc2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1141.325900] env[68571]: DEBUG oslo_concurrency.lockutils [None req-0fbaf758-ac4c-4d68-b4ff-90c926cf1009 tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Lock "4a43ba00-1df6-4f10-a4ce-37c4ae353cc2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1141.327881] env[68571]: INFO nova.compute.manager [None req-0fbaf758-ac4c-4d68-b4ff-90c926cf1009 tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Terminating instance [ 1141.329478] env[68571]: DEBUG nova.compute.manager [None req-0fbaf758-ac4c-4d68-b4ff-90c926cf1009 tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1141.329667] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-0fbaf758-ac4c-4d68-b4ff-90c926cf1009 tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1141.330126] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a99b1215-29e2-4d62-bb32-e86536c4d8ed {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1141.340061] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc3078bb-35e1-4635-ad64-3879cd8a2ee2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1141.350431] env[68571]: DEBUG nova.compute.manager [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1141.374230] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-0fbaf758-ac4c-4d68-b4ff-90c926cf1009 tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2 could not be found. 
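Two things in the terminate sequence above are worth calling out: the whole operation is serialized on a lock named after the instance UUID (acquired here only after waiting 372.056s for the build attempt to release it), and a backend VM that no longer exists is treated as already destroyed rather than as an error (the WARNING with nova.exception.InstanceNotFound, immediately followed by "Instance destroyed"). A minimal sketch of that pattern, with hypothetical names throughout; lockutils.lock is the real oslo.concurrency context manager, while destroy_on_hypervisor and InstanceNotFound stand in for the Nova pieces.

from oslo_concurrency import lockutils

class InstanceNotFound(Exception):
    """Stands in for nova.exception.InstanceNotFound."""

def do_terminate_instance(instance_uuid, destroy_on_hypervisor):
    # Mirrors: Lock "<uuid>" acquired by "...do_terminate_instance";
    # concurrent build/terminate/sync operations queue up on this name.
    with lockutils.lock(instance_uuid):
        try:
            destroy_on_hypervisor(instance_uuid)
        except InstanceNotFound:
            # The backend VM is already gone, so deletion is idempotent:
            # log a warning and report the instance destroyed anyway.
            pass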
[ 1141.374431] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-0fbaf758-ac4c-4d68-b4ff-90c926cf1009 tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1141.374606] env[68571]: INFO nova.compute.manager [None req-0fbaf758-ac4c-4d68-b4ff-90c926cf1009 tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1141.374846] env[68571]: DEBUG oslo.service.loopingcall [None req-0fbaf758-ac4c-4d68-b4ff-90c926cf1009 tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1141.375088] env[68571]: DEBUG nova.compute.manager [-] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1141.375187] env[68571]: DEBUG nova.network.neutron [-] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1141.400931] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1141.401191] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1141.402665] env[68571]: INFO nova.compute.claims [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1141.405211] env[68571]: DEBUG nova.network.neutron [-] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1141.414854] env[68571]: INFO nova.compute.manager [-] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] Took 0.04 seconds to deallocate network for instance. 
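Note how the successful claim for afe033a3-4e04-4249-beed-169a3e40a721 above and the earlier abort_instance_claim for the failed build both take the same "compute_resources" lock before touching provider inventory. A toy model, not Nova's ResourceTracker, of why claim and rollback must serialize on one lock:

import threading

class ResourceTracker:
    """Toy inventory tracker; real inventory also covers DISK_GB,
    reserved amounts and allocation ratios."""

    def __init__(self, vcpus, memory_mb):
        self._lock = threading.Lock()  # plays the role of "compute_resources"
        self.free = {'VCPU': vcpus, 'MEMORY_MB': memory_mb}

    def instance_claim(self, flavor):
        # Lock "compute_resources" acquired by "...instance_claim"
        with self._lock:
            if any(self.free[r] < n for r, n in flavor.items()):
                raise RuntimeError('insufficient resources')
            for r, n in flavor.items():
                self.free[r] -= n

    def abort_instance_claim(self, flavor):
        # Same lock on the failure path, so a rollback never interleaves
        # with a concurrent claim (cf. "Aborting claim" earlier).
        with self._lock:
            for r, n in flavor.items():
                self.free[r] += n

# e.g. the m1.nano build logged here: 1 VCPU and 128 MB claimed against
# the 48-VCPU / 196590-MB provider inventory reported above.
rt = ResourceTracker(vcpus=48, memory_mb=196590)
rt.instance_claim({'VCPU': 1, 'MEMORY_MB': 128})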
[ 1141.504597] env[68571]: DEBUG oslo_concurrency.lockutils [None req-0fbaf758-ac4c-4d68-b4ff-90c926cf1009 tempest-ServerAddressesNegativeTestJSON-2092165845 tempest-ServerAddressesNegativeTestJSON-2092165845-project-member] Lock "4a43ba00-1df6-4f10-a4ce-37c4ae353cc2" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.179s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1141.505510] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "4a43ba00-1df6-4f10-a4ce-37c4ae353cc2" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 27.002s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1141.505510] env[68571]: INFO nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 4a43ba00-1df6-4f10-a4ce-37c4ae353cc2] During sync_power_state the instance has a pending task (deleting). Skip. [ 1141.505639] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "4a43ba00-1df6-4f10-a4ce-37c4ae353cc2" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1141.756808] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1067bbac-ecea-4a69-99b5-0d06b4820f7b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1141.764377] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd54460a-0686-43d0-a4d9-fb427a5ca0e1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1141.793600] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83f9b7f3-f576-4395-9c2f-1f109a7dfd78 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1141.800561] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23c73814-9336-480d-86cb-dfe3e4ab30fe {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1141.813843] env[68571]: DEBUG nova.compute.provider_tree [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1141.824105] env[68571]: DEBUG nova.scheduler.client.report [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': 
{'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1141.838714] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.437s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1141.839191] env[68571]: DEBUG nova.compute.manager [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1141.873726] env[68571]: DEBUG nova.compute.utils [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1141.877021] env[68571]: DEBUG nova.compute.manager [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1141.877021] env[68571]: DEBUG nova.network.neutron [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1141.884304] env[68571]: DEBUG nova.compute.manager [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1141.936436] env[68571]: DEBUG nova.policy [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd34e5361b36c4dc5824b0f42a37e6bb8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '290427ab03f446ce9297ea393c083ff9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 1141.949740] env[68571]: DEBUG nova.compute.manager [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1141.974610] env[68571]: DEBUG nova.virt.hardware [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1141.974749] env[68571]: DEBUG nova.virt.hardware [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1141.975634] env[68571]: DEBUG nova.virt.hardware [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1141.975634] env[68571]: DEBUG nova.virt.hardware [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1141.975634] env[68571]: DEBUG nova.virt.hardware [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1141.975634] env[68571]: DEBUG nova.virt.hardware [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1141.975634] env[68571]: DEBUG nova.virt.hardware [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1141.975884] env[68571]: DEBUG nova.virt.hardware [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1141.975925] env[68571]: DEBUG 
nova.virt.hardware [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1141.979018] env[68571]: DEBUG nova.virt.hardware [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1141.979018] env[68571]: DEBUG nova.virt.hardware [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1141.979018] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfcde08d-627d-4e73-90a6-7fd1bfc89052 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1141.985470] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-95044b1c-866f-4879-84b2-b32cebe4f31b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1142.258159] env[68571]: DEBUG nova.network.neutron [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Successfully created port: 25be0d11-97e9-4f39-8f7a-6937836245a2 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1142.840645] env[68571]: DEBUG nova.network.neutron [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Successfully updated port: 25be0d11-97e9-4f39-8f7a-6937836245a2 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1142.853297] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "refresh_cache-afe033a3-4e04-4249-beed-169a3e40a721" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1142.853446] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquired lock "refresh_cache-afe033a3-4e04-4249-beed-169a3e40a721" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1142.853596] env[68571]: DEBUG nova.network.neutron [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1142.890845] env[68571]: DEBUG nova.network.neutron [None 
req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1143.242319] env[68571]: DEBUG nova.network.neutron [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Updating instance_info_cache with network_info: [{"id": "25be0d11-97e9-4f39-8f7a-6937836245a2", "address": "fa:16:3e:29:e0:e9", "network": {"id": "653e8d49-b7ab-4d09-aa68-b76012e5b38e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-503364041-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "290427ab03f446ce9297ea393c083ff9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap25be0d11-97", "ovs_interfaceid": "25be0d11-97e9-4f39-8f7a-6937836245a2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1143.248902] env[68571]: DEBUG nova.compute.manager [req-3e81945f-ba0b-4c66-81b0-e6b1b6c7d63a req-6e9bbd5f-5afb-4fc4-b44b-e27f2396b486 service nova] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Received event network-vif-plugged-25be0d11-97e9-4f39-8f7a-6937836245a2 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1143.249028] env[68571]: DEBUG oslo_concurrency.lockutils [req-3e81945f-ba0b-4c66-81b0-e6b1b6c7d63a req-6e9bbd5f-5afb-4fc4-b44b-e27f2396b486 service nova] Acquiring lock "afe033a3-4e04-4249-beed-169a3e40a721-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1143.249221] env[68571]: DEBUG oslo_concurrency.lockutils [req-3e81945f-ba0b-4c66-81b0-e6b1b6c7d63a req-6e9bbd5f-5afb-4fc4-b44b-e27f2396b486 service nova] Lock "afe033a3-4e04-4249-beed-169a3e40a721-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1143.249420] env[68571]: DEBUG oslo_concurrency.lockutils [req-3e81945f-ba0b-4c66-81b0-e6b1b6c7d63a req-6e9bbd5f-5afb-4fc4-b44b-e27f2396b486 service nova] Lock "afe033a3-4e04-4249-beed-169a3e40a721-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1143.249932] env[68571]: DEBUG nova.compute.manager 
[req-3e81945f-ba0b-4c66-81b0-e6b1b6c7d63a req-6e9bbd5f-5afb-4fc4-b44b-e27f2396b486 service nova] [instance: afe033a3-4e04-4249-beed-169a3e40a721] No waiting events found dispatching network-vif-plugged-25be0d11-97e9-4f39-8f7a-6937836245a2 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1143.249932] env[68571]: WARNING nova.compute.manager [req-3e81945f-ba0b-4c66-81b0-e6b1b6c7d63a req-6e9bbd5f-5afb-4fc4-b44b-e27f2396b486 service nova] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Received unexpected event network-vif-plugged-25be0d11-97e9-4f39-8f7a-6937836245a2 for instance with vm_state building and task_state spawning. [ 1143.249932] env[68571]: DEBUG nova.compute.manager [req-3e81945f-ba0b-4c66-81b0-e6b1b6c7d63a req-6e9bbd5f-5afb-4fc4-b44b-e27f2396b486 service nova] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Received event network-changed-25be0d11-97e9-4f39-8f7a-6937836245a2 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1143.250112] env[68571]: DEBUG nova.compute.manager [req-3e81945f-ba0b-4c66-81b0-e6b1b6c7d63a req-6e9bbd5f-5afb-4fc4-b44b-e27f2396b486 service nova] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Refreshing instance network info cache due to event network-changed-25be0d11-97e9-4f39-8f7a-6937836245a2. {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1143.250155] env[68571]: DEBUG oslo_concurrency.lockutils [req-3e81945f-ba0b-4c66-81b0-e6b1b6c7d63a req-6e9bbd5f-5afb-4fc4-b44b-e27f2396b486 service nova] Acquiring lock "refresh_cache-afe033a3-4e04-4249-beed-169a3e40a721" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1143.253702] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Releasing lock "refresh_cache-afe033a3-4e04-4249-beed-169a3e40a721" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1143.253955] env[68571]: DEBUG nova.compute.manager [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Instance network_info: |[{"id": "25be0d11-97e9-4f39-8f7a-6937836245a2", "address": "fa:16:3e:29:e0:e9", "network": {"id": "653e8d49-b7ab-4d09-aa68-b76012e5b38e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-503364041-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "290427ab03f446ce9297ea393c083ff9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap25be0d11-97", "ovs_interfaceid": "25be0d11-97e9-4f39-8f7a-6937836245a2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1143.254273] env[68571]: DEBUG oslo_concurrency.lockutils [req-3e81945f-ba0b-4c66-81b0-e6b1b6c7d63a req-6e9bbd5f-5afb-4fc4-b44b-e27f2396b486 service nova] Acquired lock "refresh_cache-afe033a3-4e04-4249-beed-169a3e40a721" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1143.254450] env[68571]: DEBUG nova.network.neutron [req-3e81945f-ba0b-4c66-81b0-e6b1b6c7d63a req-6e9bbd5f-5afb-4fc4-b44b-e27f2396b486 service nova] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Refreshing network info cache for port 25be0d11-97e9-4f39-8f7a-6937836245a2 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1143.255502] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:29:e0:e9', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2180b40f-2bb0-47da-ba80-c2fbe7f98af0', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '25be0d11-97e9-4f39-8f7a-6937836245a2', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1143.263202] env[68571]: DEBUG oslo.service.loopingcall [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1143.264206] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1143.266438] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e8e45ef4-22d6-43ed-b3fc-b27597b85262 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1143.286781] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1143.286781] env[68571]: value = "task-3467675" [ 1143.286781] env[68571]: _type = "Task" [ 1143.286781] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1143.294543] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467675, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1143.522798] env[68571]: DEBUG nova.network.neutron [req-3e81945f-ba0b-4c66-81b0-e6b1b6c7d63a req-6e9bbd5f-5afb-4fc4-b44b-e27f2396b486 service nova] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Updated VIF entry in instance network info cache for port 25be0d11-97e9-4f39-8f7a-6937836245a2. 
{{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1143.523289] env[68571]: DEBUG nova.network.neutron [req-3e81945f-ba0b-4c66-81b0-e6b1b6c7d63a req-6e9bbd5f-5afb-4fc4-b44b-e27f2396b486 service nova] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Updating instance_info_cache with network_info: [{"id": "25be0d11-97e9-4f39-8f7a-6937836245a2", "address": "fa:16:3e:29:e0:e9", "network": {"id": "653e8d49-b7ab-4d09-aa68-b76012e5b38e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-503364041-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "290427ab03f446ce9297ea393c083ff9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap25be0d11-97", "ovs_interfaceid": "25be0d11-97e9-4f39-8f7a-6937836245a2", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1143.533299] env[68571]: DEBUG oslo_concurrency.lockutils [req-3e81945f-ba0b-4c66-81b0-e6b1b6c7d63a req-6e9bbd5f-5afb-4fc4-b44b-e27f2396b486 service nova] Releasing lock "refresh_cache-afe033a3-4e04-4249-beed-169a3e40a721" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1143.798145] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467675, 'name': CreateVM_Task, 'duration_secs': 0.337314} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1143.798371] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1143.798957] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1143.799135] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1143.799462] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1143.799719] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-025bb96f-8687-4b54-a7a8-72bcb0f0b9fb {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1143.804697] env[68571]: DEBUG oslo_vmware.api [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Waiting for the task: (returnval){ [ 1143.804697] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]5222bac5-f001-2493-0460-0734fbf0ae2a" [ 1143.804697] env[68571]: _type = "Task" [ 1143.804697] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1143.812886] env[68571]: DEBUG oslo_vmware.api [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]5222bac5-f001-2493-0460-0734fbf0ae2a, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1144.315146] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1144.315463] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1144.315711] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1151.548573] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Acquiring lock "56c7e368-4032-4028-83f0-58b0cd3b3cbd" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1151.548875] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Lock "56c7e368-4032-4028-83f0-58b0cd3b3cbd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1159.859516] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b556daf8-df4d-475b-a3e9-a01ce9a72292 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "afe033a3-4e04-4249-beed-169a3e40a721" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1166.855507] env[68571]: DEBUG oslo_concurrency.lockutils [None req-20ddc929-573a-4f9f-9533-91af9da978f0 tempest-ListServerFiltersTestJSON-1460936247 tempest-ListServerFiltersTestJSON-1460936247-project-member] Acquiring lock "67209cb0-7bb2-4aed-969a-e0d208fbf71b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1166.855802] env[68571]: DEBUG oslo_concurrency.lockutils [None req-20ddc929-573a-4f9f-9533-91af9da978f0 tempest-ListServerFiltersTestJSON-1460936247 tempest-ListServerFiltersTestJSON-1460936247-project-member] Lock "67209cb0-7bb2-4aed-969a-e0d208fbf71b" acquired by 
"nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1167.344614] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8effc918-eea1-41da-bdf4-ef4c5261e72c tempest-ListServerFiltersTestJSON-1460936247 tempest-ListServerFiltersTestJSON-1460936247-project-member] Acquiring lock "3cea970e-78f8-4b67-9350-65d3507f6b18" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1167.344860] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8effc918-eea1-41da-bdf4-ef4c5261e72c tempest-ListServerFiltersTestJSON-1460936247 tempest-ListServerFiltersTestJSON-1460936247-project-member] Lock "3cea970e-78f8-4b67-9350-65d3507f6b18" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1167.892750] env[68571]: DEBUG oslo_concurrency.lockutils [None req-19904ea2-dc4f-47ec-be0a-568e0a5e9077 tempest-ListServerFiltersTestJSON-1460936247 tempest-ListServerFiltersTestJSON-1460936247-project-member] Acquiring lock "d62a50a6-fef2-42a8-a066-e36211c57f73" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1167.893288] env[68571]: DEBUG oslo_concurrency.lockutils [None req-19904ea2-dc4f-47ec-be0a-568e0a5e9077 tempest-ListServerFiltersTestJSON-1460936247 tempest-ListServerFiltersTestJSON-1460936247-project-member] Lock "d62a50a6-fef2-42a8-a066-e36211c57f73" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1172.298275] env[68571]: DEBUG oslo_concurrency.lockutils [None req-43190948-579f-4dfb-98b2-3f76e5c36d5b tempest-AttachVolumeShelveTestJSON-1274929045 tempest-AttachVolumeShelveTestJSON-1274929045-project-member] Acquiring lock "f0b9847b-9438-4be7-a081-db33dd3ff998" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1172.298630] env[68571]: DEBUG oslo_concurrency.lockutils [None req-43190948-579f-4dfb-98b2-3f76e5c36d5b tempest-AttachVolumeShelveTestJSON-1274929045 tempest-AttachVolumeShelveTestJSON-1274929045-project-member] Lock "f0b9847b-9438-4be7-a081-db33dd3ff998" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1176.510542] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1176.522755] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock 
"compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1176.522906] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1176.523099] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1176.523354] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1176.524495] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-662c3400-92ea-473c-b5f5-d0455b07e1e3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1176.532918] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ea5a62d-c1b5-48d7-81ff-2a2184a908ea {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1176.548467] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad7b3ddc-5731-40a4-8695-17a05f59b683 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1176.554462] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-85292ce8-516b-4283-b99a-5c9e757be408 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1176.582569] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180929MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1176.582717] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1176.582900] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1176.655640] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3adaf481-5844-45ac-8dc9-eb396a47ed1c 
actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1176.655812] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance c962c9c7-04a4-46ec-a46f-fac13caa6a1e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1176.655942] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 25f17a16-f752-4927-a2a5-73f1f18e5c8c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1176.656079] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 244ba708-279e-440e-bc18-8c6ee7b83250 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1176.656201] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1176.656320] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance b60eb700-434f-4bea-a84f-9071402001c3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1176.656436] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 5e571ae2-9d45-402d-bce5-6e3721cc5374 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1176.656548] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1176.656661] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance b90ac11a-50c6-4d12-a545-ccd92243e6ca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1176.656774] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance afe033a3-4e04-4249-beed-169a3e40a721 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1176.667187] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 061ea5d6-5470-4d7d-9ab1-ae5e606dd9cd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1176.676972] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 7a137e14-98ec-4718-8ff4-3700d2ef7ee9 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1176.686605] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 9cace51b-100c-48d0-813c-eb31ec9384ec has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1176.696311] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance c8be0938-4b38-4e05-8afa-202d87a315b7 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1176.707206] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f5328efa-b3e0-48b2-8f13-9715e46cb017 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1176.716733] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 2c21a8e5-da7f-4b3a-97ab-ec35f794edac has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1176.726614] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 5af733d9-dfa4-4059-8e33-1818695c8692 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1176.735747] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance a333a6c9-5119-4d2f-81f3-cb86795ed364 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1176.745507] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 1cbb0e1a-ca70-4e0e-9adb-c4b62e80818b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1176.755928] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance df5d4c12-01c8-46e2-b2a9-cf61a7d10e1a has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1176.765516] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 466d2eae-c109-4286-a223-edca73d6c8fa has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1176.774634] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance e025f82d-a6a8-4dd4-b891-872f4b2fa176 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1176.784050] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance a1253c3f-921b-4417-a8fb-22168474f9c1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1176.792903] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance b5c24d31-97f5-4b9b-a08e-4006a1d5d316 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1176.806864] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1176.817687] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 56c7e368-4032-4028-83f0-58b0cd3b3cbd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1176.827119] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 67209cb0-7bb2-4aed-969a-e0d208fbf71b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1176.855264] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3cea970e-78f8-4b67-9350-65d3507f6b18 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1176.866021] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance d62a50a6-fef2-42a8-a066-e36211c57f73 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1176.874723] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f0b9847b-9438-4be7-a081-db33dd3ff998 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1176.875018] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1176.875215] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1177.210066] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-650c54ce-defb-46ae-afe2-7b028f2b5cb9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1177.217783] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a94e906c-7920-47a3-90ce-9defaf624b66 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1177.246565] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4f495464-c7d1-40ee-9cdf-611cd0c2cc07 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1177.253632] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10f11da9-4f96-440d-b355-f0c978787f67 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1177.266457] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1177.274924] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1177.293858] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1177.294053] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.711s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1178.974169] env[68571]: DEBUG oslo_concurrency.lockutils [None 
req-5a0d3411-b39d-4cd4-badf-063b65754298 tempest-ImagesTestJSON-1315536367 tempest-ImagesTestJSON-1315536367-project-member] Acquiring lock "b6a0771c-53cb-4503-bbc0-db992326b245" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1178.974169] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5a0d3411-b39d-4cd4-badf-063b65754298 tempest-ImagesTestJSON-1315536367 tempest-ImagesTestJSON-1315536367-project-member] Lock "b6a0771c-53cb-4503-bbc0-db992326b245" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1179.267871] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1179.268131] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1179.268305] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1179.489663] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1181.859285] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ed43f1dd-787a-4c98-87dd-815ee2c472d6 tempest-ServersTestMultiNic-1790639670 tempest-ServersTestMultiNic-1790639670-project-member] Acquiring lock "6532563b-5e91-409f-be05-084196087a4d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1181.859550] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ed43f1dd-787a-4c98-87dd-815ee2c472d6 tempest-ServersTestMultiNic-1790639670 tempest-ServersTestMultiNic-1790639670-project-member] Lock "6532563b-5e91-409f-be05-084196087a4d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1182.490521] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1182.490710] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1182.490836] env[68571]: DEBUG 
nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1182.515459] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1182.515625] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1182.515757] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1182.515884] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1182.516015] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1182.516344] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1182.516469] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1182.516542] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1182.516668] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1182.516787] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1182.516919] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1183.489752] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1183.490099] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1183.490435] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1183.490645] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1186.816242] env[68571]: DEBUG oslo_concurrency.lockutils [None req-aa1230a2-8b02-476b-9f56-cab3486b6af9 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Acquiring lock "10b3cea3-b9d1-45b7-9ac8-b922952371ba" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1186.816563] env[68571]: DEBUG oslo_concurrency.lockutils [None req-aa1230a2-8b02-476b-9f56-cab3486b6af9 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Lock "10b3cea3-b9d1-45b7-9ac8-b922952371ba" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1187.188897] env[68571]: WARNING oslo_vmware.rw_handles [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1187.188897] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1187.188897] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1187.188897] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1187.188897] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1187.188897] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 1187.188897] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1187.188897] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1187.188897] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1187.188897] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1187.188897] env[68571]: ERROR oslo_vmware.rw_handles 
http.client.RemoteDisconnected: Remote end closed connection without response [ 1187.188897] env[68571]: ERROR oslo_vmware.rw_handles [ 1187.189437] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/dc43ff76-5964-47fb-93b7-0ce8372eb0f7/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1187.191477] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1187.191729] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Copying Virtual Disk [datastore1] vmware_temp/dc43ff76-5964-47fb-93b7-0ce8372eb0f7/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/dc43ff76-5964-47fb-93b7-0ce8372eb0f7/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1187.192071] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ad8af0df-eefc-4d4e-883e-76d4f4c77337 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1187.201187] env[68571]: DEBUG oslo_vmware.api [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Waiting for the task: (returnval){ [ 1187.201187] env[68571]: value = "task-3467676" [ 1187.201187] env[68571]: _type = "Task" [ 1187.201187] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1187.211178] env[68571]: DEBUG oslo_vmware.api [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Task: {'id': task-3467676, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1187.712394] env[68571]: DEBUG oslo_vmware.exceptions [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Fault InvalidArgument not matched. 
{{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1187.712704] env[68571]: DEBUG oslo_concurrency.lockutils [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1187.713330] env[68571]: ERROR nova.compute.manager [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1187.713330] env[68571]: Faults: ['InvalidArgument'] [ 1187.713330] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Traceback (most recent call last): [ 1187.713330] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1187.713330] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] yield resources [ 1187.713330] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1187.713330] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] self.driver.spawn(context, instance, image_meta, [ 1187.713330] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1187.713330] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1187.713330] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1187.713330] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] self._fetch_image_if_missing(context, vi) [ 1187.713330] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1187.713795] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] image_cache(vi, tmp_image_ds_loc) [ 1187.713795] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1187.713795] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] vm_util.copy_virtual_disk( [ 1187.713795] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1187.713795] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] session._wait_for_task(vmdk_copy_task) [ 1187.713795] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1187.713795] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] return self.wait_for_task(task_ref) [ 1187.713795] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1187.713795] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] return evt.wait() [ 1187.713795] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1187.713795] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] result = hub.switch() [ 1187.713795] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1187.713795] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] return self.greenlet.switch() [ 1187.714253] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1187.714253] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] self.f(*self.args, **self.kw) [ 1187.714253] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1187.714253] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] raise exceptions.translate_fault(task_info.error) [ 1187.714253] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1187.714253] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Faults: ['InvalidArgument'] [ 1187.714253] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] [ 1187.714253] env[68571]: INFO nova.compute.manager [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Terminating instance [ 1187.715242] env[68571]: DEBUG oslo_concurrency.lockutils [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1187.715451] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1187.715704] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-8f7b354b-8507-4d8b-b2c9-5b727b0a5b15 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1187.718678] env[68571]: DEBUG nova.compute.manager [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1187.718871] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1187.719596] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c244a21e-f7ff-4569-8e24-4009ec4cd603 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1187.727559] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1187.728592] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-41a13837-5fde-48bb-b33f-96f611c96027 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1187.730160] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1187.730336] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1187.731112] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5103ab9c-4f1e-48e7-87a9-2a9e4b1f32da {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1187.735959] env[68571]: DEBUG oslo_vmware.api [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Waiting for the task: (returnval){ [ 1187.735959] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]5259cda1-c546-4faa-735a-0e1f44217631" [ 1187.735959] env[68571]: _type = "Task" [ 1187.735959] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1187.746014] env[68571]: DEBUG oslo_vmware.api [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]5259cda1-c546-4faa-735a-0e1f44217631, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1187.796436] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1187.796656] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1187.796830] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Deleting the datastore file [datastore1] 3adaf481-5844-45ac-8dc9-eb396a47ed1c {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1187.797124] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-af5d218d-dcf0-4b13-9afa-be60bee4d1ef {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1187.804795] env[68571]: DEBUG oslo_vmware.api [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Waiting for the task: (returnval){ [ 1187.804795] env[68571]: value = "task-3467678" [ 1187.804795] env[68571]: _type = "Task" [ 1187.804795] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1187.812631] env[68571]: DEBUG oslo_vmware.api [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Task: {'id': task-3467678, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1188.246919] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1188.247201] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Creating directory with path [datastore1] vmware_temp/496be897-cdd8-446e-87ea-3f0f3ee99c21/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1188.247433] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-edb2335a-84fd-48c7-9a71-a7559d17dce9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1188.259258] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Created directory with path [datastore1] vmware_temp/496be897-cdd8-446e-87ea-3f0f3ee99c21/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1188.259644] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Fetch image to [datastore1] vmware_temp/496be897-cdd8-446e-87ea-3f0f3ee99c21/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1188.259644] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/496be897-cdd8-446e-87ea-3f0f3ee99c21/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1188.260371] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-66243f08-6bb7-4c6d-8db5-a5378267bb09 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1188.267176] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12c15675-c4d4-4ca4-b202-d337f5b2882d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1188.278033] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b066357a-ab8f-42cc-b5db-f40f4c8c6953 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1188.309652] env[68571]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-db33ed6a-5ff9-4cc3-a536-f0234d85d6b1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1188.318432] env[68571]: DEBUG oslo_vmware.api [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Task: {'id': task-3467678, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.081664} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1188.318952] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1188.319156] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1188.319331] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1188.319506] env[68571]: INFO nova.compute.manager [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1188.320994] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-08134fd7-27cd-4635-bae4-7f0b0a41700e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1188.323309] env[68571]: DEBUG nova.compute.claims [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1188.323503] env[68571]: DEBUG oslo_concurrency.lockutils [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1188.323719] env[68571]: DEBUG oslo_concurrency.lockutils [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1188.345706] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1188.555660] env[68571]: DEBUG oslo_vmware.rw_handles [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/496be897-cdd8-446e-87ea-3f0f3ee99c21/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1188.620776] env[68571]: DEBUG oslo_vmware.rw_handles [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1188.620964] env[68571]: DEBUG oslo_vmware.rw_handles [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/496be897-cdd8-446e-87ea-3f0f3ee99c21/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1188.790544] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1fd5f6e9-10ab-4973-b562-b68138a698ab {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1188.798127] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e1c70c94-1613-480e-a81d-21f3a245b0f4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1188.829213] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83c15c14-f4fe-4c52-aae3-e792c830b037 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1188.836802] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc7cd5be-0a69-4e51-9f76-9dab9f7734ce {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1188.849897] env[68571]: DEBUG nova.compute.provider_tree [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1188.860111] env[68571]: DEBUG nova.scheduler.client.report [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1188.876584] env[68571]: DEBUG oslo_concurrency.lockutils [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.553s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1188.877131] env[68571]: ERROR nova.compute.manager [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1188.877131] env[68571]: Faults: ['InvalidArgument'] [ 1188.877131] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Traceback (most recent call last): [ 1188.877131] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in 
_build_and_run_instance [ 1188.877131] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] self.driver.spawn(context, instance, image_meta, [ 1188.877131] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1188.877131] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1188.877131] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1188.877131] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] self._fetch_image_if_missing(context, vi) [ 1188.877131] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1188.877131] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] image_cache(vi, tmp_image_ds_loc) [ 1188.877131] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1188.877650] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] vm_util.copy_virtual_disk( [ 1188.877650] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1188.877650] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] session._wait_for_task(vmdk_copy_task) [ 1188.877650] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1188.877650] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] return self.wait_for_task(task_ref) [ 1188.877650] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1188.877650] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] return evt.wait() [ 1188.877650] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1188.877650] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] result = hub.switch() [ 1188.877650] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1188.877650] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] return self.greenlet.switch() [ 1188.877650] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1188.877650] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] self.f(*self.args, **self.kw) [ 1188.878018] env[68571]: ERROR nova.compute.manager [instance: 
3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1188.878018] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] raise exceptions.translate_fault(task_info.error) [ 1188.878018] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1188.878018] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Faults: ['InvalidArgument'] [ 1188.878018] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] [ 1188.878018] env[68571]: DEBUG nova.compute.utils [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1188.879298] env[68571]: DEBUG nova.compute.manager [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Build of instance 3adaf481-5844-45ac-8dc9-eb396a47ed1c was re-scheduled: A specified parameter was not correct: fileType [ 1188.879298] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1188.879661] env[68571]: DEBUG nova.compute.manager [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1188.879833] env[68571]: DEBUG nova.compute.manager [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1188.879986] env[68571]: DEBUG nova.compute.manager [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1188.880157] env[68571]: DEBUG nova.network.neutron [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1189.026367] env[68571]: DEBUG neutronclient.v2_0.client [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68571) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1189.028421] env[68571]: ERROR nova.compute.manager [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1189.028421] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Traceback (most recent call last): [ 1189.028421] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1189.028421] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] self.driver.spawn(context, instance, image_meta, [ 1189.028421] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1189.028421] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1189.028421] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1189.028421] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] self._fetch_image_if_missing(context, vi) [ 1189.028421] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1189.028421] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] image_cache(vi, tmp_image_ds_loc) [ 1189.028421] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1189.028421] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] vm_util.copy_virtual_disk( [ 1189.028778] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1189.028778] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] session._wait_for_task(vmdk_copy_task) [ 1189.028778] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1189.028778] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] return self.wait_for_task(task_ref) [ 1189.028778] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1189.028778] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] return evt.wait() [ 1189.028778] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1189.028778] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] result = hub.switch() [ 1189.028778] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1189.028778] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] return self.greenlet.switch() [ 1189.028778] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1189.028778] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] self.f(*self.args, **self.kw) [ 1189.028778] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1189.029135] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] raise exceptions.translate_fault(task_info.error) [ 1189.029135] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1189.029135] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Faults: ['InvalidArgument'] [ 1189.029135] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] [ 1189.029135] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] During handling of the above exception, another exception occurred: [ 1189.029135] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] [ 1189.029135] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Traceback (most recent call last): [ 1189.029135] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/compute/manager.py", line 2430, in _do_build_and_run_instance [ 1189.029135] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] self._build_and_run_instance(context, instance, image, [ 1189.029135] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File 
"/opt/stack/nova/nova/compute/manager.py", line 2722, in _build_and_run_instance [ 1189.029135] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] raise exception.RescheduledException( [ 1189.029135] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] nova.exception.RescheduledException: Build of instance 3adaf481-5844-45ac-8dc9-eb396a47ed1c was re-scheduled: A specified parameter was not correct: fileType [ 1189.029135] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Faults: ['InvalidArgument'] [ 1189.029135] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] [ 1189.029568] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] During handling of the above exception, another exception occurred: [ 1189.029568] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] [ 1189.029568] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Traceback (most recent call last): [ 1189.029568] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1189.029568] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] ret = obj(*args, **kwargs) [ 1189.029568] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1189.029568] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] exception_handler_v20(status_code, error_body) [ 1189.029568] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1189.029568] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] raise client_exc(message=error_message, [ 1189.029568] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1189.029568] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Neutron server returns request_ids: ['req-8397b3d8-f727-4f0b-bbff-0a0e7cb8bdea'] [ 1189.029568] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] [ 1189.029568] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] During handling of the above exception, another exception occurred: [ 1189.029916] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] [ 1189.029916] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Traceback (most recent call last): [ 1189.029916] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/compute/manager.py", line 3019, in _cleanup_allocated_networks [ 1189.029916] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] self._deallocate_network(context, instance, requested_networks) [ 1189.029916] env[68571]: ERROR nova.compute.manager 
[instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1189.029916] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] self.network_api.deallocate_for_instance( [ 1189.029916] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1189.029916] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] data = neutron.list_ports(**search_opts) [ 1189.029916] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1189.029916] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] ret = obj(*args, **kwargs) [ 1189.029916] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1189.029916] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] return self.list('ports', self.ports_path, retrieve_all, [ 1189.029916] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1189.030272] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] ret = obj(*args, **kwargs) [ 1189.030272] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1189.030272] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] for r in self._pagination(collection, path, **params): [ 1189.030272] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1189.030272] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] res = self.get(path, params=params) [ 1189.030272] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1189.030272] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] ret = obj(*args, **kwargs) [ 1189.030272] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1189.030272] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] return self.retry_request("GET", action, body=body, [ 1189.030272] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1189.030272] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] ret = obj(*args, **kwargs) [ 1189.030272] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1189.030272] env[68571]: ERROR nova.compute.manager [instance: 
3adaf481-5844-45ac-8dc9-eb396a47ed1c] return self.do_request(method, action, body=body, [ 1189.030647] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1189.030647] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] ret = obj(*args, **kwargs) [ 1189.030647] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1189.030647] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] self._handle_fault_response(status_code, replybody, resp) [ 1189.030647] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1189.030647] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] raise exception.Unauthorized() [ 1189.030647] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] nova.exception.Unauthorized: Not authorized. [ 1189.030647] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] [ 1189.082051] env[68571]: INFO nova.scheduler.client.report [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Deleted allocations for instance 3adaf481-5844-45ac-8dc9-eb396a47ed1c [ 1189.113288] env[68571]: DEBUG oslo_concurrency.lockutils [None req-49f4722c-43a8-4cae-83c9-6b89666a416e tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Lock "3adaf481-5844-45ac-8dc9-eb396a47ed1c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 615.394s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1189.114517] env[68571]: DEBUG oslo_concurrency.lockutils [None req-7bd7f510-f607-4ce9-8d24-2cf096a6bfb4 tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Lock "3adaf481-5844-45ac-8dc9-eb396a47ed1c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 418.584s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1189.114767] env[68571]: DEBUG oslo_concurrency.lockutils [None req-7bd7f510-f607-4ce9-8d24-2cf096a6bfb4 tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Acquiring lock "3adaf481-5844-45ac-8dc9-eb396a47ed1c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1189.114929] env[68571]: DEBUG oslo_concurrency.lockutils [None req-7bd7f510-f607-4ce9-8d24-2cf096a6bfb4 tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Lock "3adaf481-5844-45ac-8dc9-eb396a47ed1c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [
1189.115125] env[68571]: DEBUG oslo_concurrency.lockutils [None req-7bd7f510-f607-4ce9-8d24-2cf096a6bfb4 tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Lock "3adaf481-5844-45ac-8dc9-eb396a47ed1c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1189.118237] env[68571]: INFO nova.compute.manager [None req-7bd7f510-f607-4ce9-8d24-2cf096a6bfb4 tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Terminating instance [ 1189.119863] env[68571]: DEBUG nova.compute.manager [None req-7bd7f510-f607-4ce9-8d24-2cf096a6bfb4 tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1189.120059] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-7bd7f510-f607-4ce9-8d24-2cf096a6bfb4 tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1189.120316] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a1337c0f-f89b-428f-9d71-02f1c9e6667b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1189.130030] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2414cb3-fd04-4bfe-9d16-fcf638e74eb7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1189.141213] env[68571]: DEBUG nova.compute.manager [None req-b1ee3429-661c-409b-8e03-a2631602de55 tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: 14dee505-e30a-4395-9fe3-fb505492c4df] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1189.166750] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-7bd7f510-f607-4ce9-8d24-2cf096a6bfb4 tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3adaf481-5844-45ac-8dc9-eb396a47ed1c could not be found. [ 1189.167261] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-7bd7f510-f607-4ce9-8d24-2cf096a6bfb4 tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1189.167261] env[68571]: INFO nova.compute.manager [None req-7bd7f510-f607-4ce9-8d24-2cf096a6bfb4 tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Took 0.05 seconds to destroy the instance on the hypervisor.
[ 1189.167762] env[68571]: DEBUG oslo.service.loopingcall [None req-7bd7f510-f607-4ce9-8d24-2cf096a6bfb4 tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1189.167958] env[68571]: DEBUG nova.compute.manager [None req-b1ee3429-661c-409b-8e03-a2631602de55 tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] [instance: 14dee505-e30a-4395-9fe3-fb505492c4df] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1189.169145] env[68571]: DEBUG nova.compute.manager [-] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1189.169879] env[68571]: DEBUG nova.network.neutron [-] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1189.195809] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b1ee3429-661c-409b-8e03-a2631602de55 tempest-SecurityGroupsTestJSON-1310948525 tempest-SecurityGroupsTestJSON-1310948525-project-member] Lock "14dee505-e30a-4395-9fe3-fb505492c4df" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 223.820s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1189.208919] env[68571]: DEBUG nova.compute.manager [None req-36bb7c0f-fa85-4230-86c5-862f959b0fc2 tempest-AttachVolumeShelveTestJSON-1274929045 tempest-AttachVolumeShelveTestJSON-1274929045-project-member] [instance: 73f10282-d15a-4d6b-a0b9-5b3cb8764ff9] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1189.249641] env[68571]: DEBUG nova.compute.manager [None req-36bb7c0f-fa85-4230-86c5-862f959b0fc2 tempest-AttachVolumeShelveTestJSON-1274929045 tempest-AttachVolumeShelveTestJSON-1274929045-project-member] [instance: 73f10282-d15a-4d6b-a0b9-5b3cb8764ff9] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1189.289277] env[68571]: DEBUG oslo_concurrency.lockutils [None req-36bb7c0f-fa85-4230-86c5-862f959b0fc2 tempest-AttachVolumeShelveTestJSON-1274929045 tempest-AttachVolumeShelveTestJSON-1274929045-project-member] Lock "73f10282-d15a-4d6b-a0b9-5b3cb8764ff9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 216.207s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1189.304105] env[68571]: DEBUG nova.compute.manager [None req-b3a791b5-ff4e-43d4-809f-aed71cd29977 tempest-ServerMetadataNegativeTestJSON-5225460 tempest-ServerMetadataNegativeTestJSON-5225460-project-member] [instance: e9b8ab85-e972-4081-ae38-602a92fe3ab9] Starting instance...
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1189.310538] env[68571]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68571) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1189.310538] env[68571]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token; please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1189.310538] env[68571]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1189.310538] env[68571]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1189.310538] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1189.310538] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1189.310538] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1189.310538] env[68571]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1189.310538] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1189.310538] env[68571]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1189.310538] env[68571]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1189.311361] env[68571]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-4622512f-795b-45f7-b7cd-feb66ac98387'] [ 1189.311361] env[68571]: ERROR oslo.service.loopingcall [ 1189.311361] env[68571]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1189.311361] env[68571]: ERROR oslo.service.loopingcall [ 1189.311361] env[68571]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1189.311361] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1189.311361] env[68571]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1189.311361] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1189.311361] env[68571]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1189.311361] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1189.311361] env[68571]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1189.311361] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1189.311361] env[68571]: ERROR oslo.service.loopingcall
self.network_api.deallocate_for_instance( [ 1189.311361] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1189.311361] env[68571]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1189.311361] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1189.311361] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1189.311361] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1189.311361] env[68571]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1189.311856] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1189.311856] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1189.311856] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1189.311856] env[68571]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1189.311856] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1189.311856] env[68571]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1189.311856] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1189.311856] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1189.311856] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1189.311856] env[68571]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1189.311856] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1189.311856] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1189.311856] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1189.311856] env[68571]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1189.311856] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1189.311856] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1189.311856] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1189.311856] env[68571]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1189.312872] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1189.312872] env[68571]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1189.312872] env[68571]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1189.312872] env[68571]: ERROR oslo.service.loopingcall [ 1189.312872] env[68571]: ERROR nova.compute.manager [None req-7bd7f510-f607-4ce9-8d24-2cf096a6bfb4 tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1189.325587] env[68571]: DEBUG nova.compute.manager [None req-b3a791b5-ff4e-43d4-809f-aed71cd29977 tempest-ServerMetadataNegativeTestJSON-5225460 tempest-ServerMetadataNegativeTestJSON-5225460-project-member] [instance: e9b8ab85-e972-4081-ae38-602a92fe3ab9] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1189.352576] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b3a791b5-ff4e-43d4-809f-aed71cd29977 tempest-ServerMetadataNegativeTestJSON-5225460 tempest-ServerMetadataNegativeTestJSON-5225460-project-member] Lock "e9b8ab85-e972-4081-ae38-602a92fe3ab9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 215.367s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1189.361034] env[68571]: ERROR nova.compute.manager [None req-7bd7f510-f607-4ce9-8d24-2cf096a6bfb4 tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1189.361034] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Traceback (most recent call last):
[ 1189.361034] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1189.361034] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     ret = obj(*args, **kwargs)
[ 1189.361034] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1189.361034] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     exception_handler_v20(status_code, error_body)
[ 1189.361034] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1189.361034] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     raise client_exc(message=error_message,
[ 1189.361034] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1189.361034] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Neutron server returns request_ids: ['req-4622512f-795b-45f7-b7cd-feb66ac98387']
[ 1189.361446] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]
[ 1189.361446] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] During handling of the above exception, another exception occurred:
[ 1189.361446] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]
[ 1189.361446] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Traceback (most recent call last):
[ 1189.361446] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance
[ 1189.361446] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     self._delete_instance(context, instance, bdms)
[ 1189.361446] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance
[ 1189.361446] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     self._shutdown_instance(context, instance, bdms)
[ 1189.361446] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance
[ 1189.361446] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     self._try_deallocate_network(context, instance, requested_networks)
[ 1189.361446] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network
[ 1189.361446] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     with excutils.save_and_reraise_exception():
[ 1189.361446] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1189.361446] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     self.force_reraise()
[ 1189.361890] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1189.361890] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     raise self.value
[ 1189.361890] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network
[ 1189.361890] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     _deallocate_network_with_retries()
[ 1189.361890] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func
[ 1189.361890] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     return evt.wait()
[ 1189.361890] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1189.361890] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     result = hub.switch()
[ 1189.361890] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1189.361890] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     return self.greenlet.switch()
[ 1189.361890] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop
[ 1189.361890] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     result = func(*self.args, **self.kw)
[ 1189.362360] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func
[ 1189.362360] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     result = f(*args, **kwargs)
[ 1189.362360] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries
[ 1189.362360] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     self._deallocate_network(
[ 1189.362360] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network
[ 1189.362360] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     self.network_api.deallocate_for_instance(
[ 1189.362360] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1189.362360] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     data = neutron.list_ports(**search_opts)
[ 1189.362360] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1189.362360] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     ret = obj(*args, **kwargs)
[ 1189.362360] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1189.362360] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     return self.list('ports', self.ports_path, retrieve_all,
[ 1189.362360] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1189.362776] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     ret = obj(*args, **kwargs)
[ 1189.362776] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1189.362776] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     for r in self._pagination(collection, path, **params):
[ 1189.362776] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1189.362776] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     res = self.get(path, params=params)
[ 1189.362776] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1189.362776] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     ret = obj(*args, **kwargs)
[ 1189.362776] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1189.362776] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     return self.retry_request("GET", action, body=body,
[ 1189.362776] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1189.362776] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     ret = obj(*args, **kwargs)
[ 1189.362776] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1189.362776] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     return self.do_request(method, action, body=body,
[ 1189.363206] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1189.363206] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     ret = obj(*args, **kwargs)
[ 1189.363206] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1189.363206] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     self._handle_fault_response(status_code, replybody, resp)
[ 1189.363206] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]   File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper
[ 1189.363206] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]     raise exception.NeutronAdminCredentialConfigurationInvalid()
[ 1189.363206] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1189.363206] env[68571]: ERROR nova.compute.manager [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c]
[ 1189.370512] env[68571]: DEBUG nova.compute.manager [None req-1af6047b-ff48-44e5-93a8-da896cca9e6b tempest-ImagesTestJSON-1315536367 tempest-ImagesTestJSON-1315536367-project-member] [instance: 061ea5d6-5470-4d7d-9ab1-ae5e606dd9cd] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1189.392018] env[68571]: DEBUG oslo_concurrency.lockutils [None req-7bd7f510-f607-4ce9-8d24-2cf096a6bfb4 tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Lock "3adaf481-5844-45ac-8dc9-eb396a47ed1c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.276s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1189.392018] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "3adaf481-5844-45ac-8dc9-eb396a47ed1c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 74.888s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1189.392018] env[68571]: INFO nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1189.392018] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "3adaf481-5844-45ac-8dc9-eb396a47ed1c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1189.398854] env[68571]: DEBUG nova.compute.manager [None req-1af6047b-ff48-44e5-93a8-da896cca9e6b tempest-ImagesTestJSON-1315536367 tempest-ImagesTestJSON-1315536367-project-member] [instance: 061ea5d6-5470-4d7d-9ab1-ae5e606dd9cd] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 1189.424316] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1af6047b-ff48-44e5-93a8-da896cca9e6b tempest-ImagesTestJSON-1315536367 tempest-ImagesTestJSON-1315536367-project-member] Lock "061ea5d6-5470-4d7d-9ab1-ae5e606dd9cd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 207.835s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1189.435024] env[68571]: DEBUG nova.compute.manager [None req-e16fb427-6ffc-4a9b-bcb0-b5513bc1c992 tempest-ServersTestMultiNic-1790639670 tempest-ServersTestMultiNic-1790639670-project-member] [instance: 7a137e14-98ec-4718-8ff4-3700d2ef7ee9] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1189.457161] env[68571]: INFO nova.compute.manager [None req-7bd7f510-f607-4ce9-8d24-2cf096a6bfb4 tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] [instance: 3adaf481-5844-45ac-8dc9-eb396a47ed1c] Successfully reverted task state from None on failure for instance.
[ 1189.461079] env[68571]: ERROR oslo_messaging.rpc.server [None req-7bd7f510-f607-4ce9-8d24-2cf096a6bfb4 tempest-ServerDiagnosticsNegativeTest-1029007653 tempest-ServerDiagnosticsNegativeTest-1029007653-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1189.461079] env[68571]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
[ 1189.461079] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1189.461079] env[68571]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1189.461079] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response
[ 1189.461079] env[68571]: ERROR oslo_messaging.rpc.server     exception_handler_v20(status_code, error_body)
[ 1189.461079] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20
[ 1189.461079] env[68571]: ERROR oslo_messaging.rpc.server     raise client_exc(message=error_message,
[ 1189.461079] env[68571]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}}
[ 1189.461079] env[68571]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-4622512f-795b-45f7-b7cd-feb66ac98387']
[ 1189.461079] env[68571]: ERROR oslo_messaging.rpc.server
[ 1189.461079] env[68571]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred:
[ 1189.461079] env[68571]: ERROR oslo_messaging.rpc.server
[ 1189.461079] env[68571]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
[ 1189.461079] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
[ 1189.461648] env[68571]: ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
[ 1189.461648] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
[ 1189.461648] env[68571]: ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
[ 1189.461648] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
[ 1189.461648] env[68571]: ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
[ 1189.461648] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped
[ 1189.461648] env[68571]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1189.461648] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1189.461648] env[68571]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1189.461648] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1189.461648] env[68571]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1189.461648] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped
[ 1189.461648] env[68571]: ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
[ 1189.461648] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function
[ 1189.461648] env[68571]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1189.461648] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1189.461648] env[68571]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1189.461648] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1189.462236] env[68571]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1189.462236] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function
[ 1189.462236] env[68571]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
[ 1189.462236] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function
[ 1189.462236] env[68571]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
[ 1189.462236] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function
[ 1189.462236] env[68571]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1189.462236] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1189.462236] env[68571]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1189.462236] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1189.462236] env[68571]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1189.462236] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function
[ 1189.462236] env[68571]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
[ 1189.462236] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance
[ 1189.462236] env[68571]: ERROR oslo_messaging.rpc.server     do_terminate_instance(instance, bdms)
[ 1189.462236] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner
[ 1189.462236] env[68571]: ERROR oslo_messaging.rpc.server     return f(*args, **kwargs)
[ 1189.462236] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance
[ 1189.462781] env[68571]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1189.462781] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1189.462781] env[68571]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1189.462781] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1189.462781] env[68571]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1189.462781] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance
[ 1189.462781] env[68571]: ERROR oslo_messaging.rpc.server     self._delete_instance(context, instance, bdms)
[ 1189.462781] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance
[ 1189.462781] env[68571]: ERROR oslo_messaging.rpc.server     self._shutdown_instance(context, instance, bdms)
[ 1189.462781] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance
[ 1189.462781] env[68571]: ERROR oslo_messaging.rpc.server     self._try_deallocate_network(context, instance, requested_networks)
[ 1189.462781] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network
[ 1189.462781] env[68571]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
[ 1189.462781] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
[ 1189.462781] env[68571]: ERROR oslo_messaging.rpc.server     self.force_reraise()
[ 1189.462781] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
[ 1189.462781] env[68571]: ERROR oslo_messaging.rpc.server     raise self.value
[ 1189.462781] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network
[ 1189.463384] env[68571]: ERROR oslo_messaging.rpc.server     _deallocate_network_with_retries()
[ 1189.463384] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func
[ 1189.463384] env[68571]: ERROR oslo_messaging.rpc.server     return evt.wait()
[ 1189.463384] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1189.463384] env[68571]: ERROR oslo_messaging.rpc.server     result = hub.switch()
[ 1189.463384] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1189.463384] env[68571]: ERROR oslo_messaging.rpc.server     return self.greenlet.switch()
[ 1189.463384] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop
[ 1189.463384] env[68571]: ERROR oslo_messaging.rpc.server     result = func(*self.args, **self.kw)
[ 1189.463384] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func
[ 1189.463384] env[68571]: ERROR oslo_messaging.rpc.server     result = f(*args, **kwargs)
[ 1189.463384] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries
[ 1189.463384] env[68571]: ERROR oslo_messaging.rpc.server     self._deallocate_network(
[ 1189.463384] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network
[ 1189.463384] env[68571]: ERROR oslo_messaging.rpc.server     self.network_api.deallocate_for_instance(
[ 1189.463384] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance
[ 1189.463384] env[68571]: ERROR oslo_messaging.rpc.server     data = neutron.list_ports(**search_opts)
[ 1189.463384] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1189.463940] env[68571]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1189.463940] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports
[ 1189.463940] env[68571]: ERROR oslo_messaging.rpc.server     return self.list('ports', self.ports_path, retrieve_all,
[ 1189.463940] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1189.463940] env[68571]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1189.463940] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list
[ 1189.463940] env[68571]: ERROR oslo_messaging.rpc.server     for r in self._pagination(collection, path, **params):
[ 1189.463940] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination
[ 1189.463940] env[68571]: ERROR oslo_messaging.rpc.server     res = self.get(path, params=params)
[ 1189.463940] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1189.463940] env[68571]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1189.463940] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get
[ 1189.463940] env[68571]: ERROR oslo_messaging.rpc.server     return self.retry_request("GET", action, body=body,
[ 1189.463940] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1189.463940] env[68571]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1189.463940] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request
[ 1189.463940] env[68571]: ERROR oslo_messaging.rpc.server     return self.do_request(method, action, body=body,
[ 1189.463940] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper
[ 1189.464612] env[68571]: ERROR oslo_messaging.rpc.server     ret = obj(*args, **kwargs)
[ 1189.464612] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request
[ 1189.464612] env[68571]: ERROR oslo_messaging.rpc.server     self._handle_fault_response(status_code, replybody, resp)
[ 1189.464612] env[68571]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper
[ 1189.464612] env[68571]: ERROR oslo_messaging.rpc.server     raise exception.NeutronAdminCredentialConfigurationInvalid()
[ 1189.464612] env[68571]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1189.464612] env[68571]: ERROR oslo_messaging.rpc.server
[ 1189.464612] env[68571]: DEBUG nova.compute.manager [None req-e16fb427-6ffc-4a9b-bcb0-b5513bc1c992 tempest-ServersTestMultiNic-1790639670 tempest-ServersTestMultiNic-1790639670-project-member] [instance: 7a137e14-98ec-4718-8ff4-3700d2ef7ee9] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 1189.492116] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e16fb427-6ffc-4a9b-bcb0-b5513bc1c992 tempest-ServersTestMultiNic-1790639670 tempest-ServersTestMultiNic-1790639670-project-member] Lock "7a137e14-98ec-4718-8ff4-3700d2ef7ee9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 206.517s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1189.500713] env[68571]: DEBUG nova.compute.manager [None req-39f896c9-3af0-41d8-980f-b379b470abaf tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 9cace51b-100c-48d0-813c-eb31ec9384ec] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1189.526740] env[68571]: DEBUG nova.compute.manager [None req-39f896c9-3af0-41d8-980f-b379b470abaf tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 9cace51b-100c-48d0-813c-eb31ec9384ec] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 1189.548093] env[68571]: DEBUG oslo_concurrency.lockutils [None req-39f896c9-3af0-41d8-980f-b379b470abaf tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Lock "9cace51b-100c-48d0-813c-eb31ec9384ec" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 199.343s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1189.557714] env[68571]: DEBUG nova.compute.manager [None req-8579d8f0-61aa-41e4-a5b5-f9df996c1a62 tempest-VolumesAdminNegativeTest-600178275 tempest-VolumesAdminNegativeTest-600178275-project-member] [instance: c8be0938-4b38-4e05-8afa-202d87a315b7] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1189.583713] env[68571]: DEBUG nova.compute.manager [None req-8579d8f0-61aa-41e4-a5b5-f9df996c1a62 tempest-VolumesAdminNegativeTest-600178275 tempest-VolumesAdminNegativeTest-600178275-project-member] [instance: c8be0938-4b38-4e05-8afa-202d87a315b7] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 1189.606241] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8579d8f0-61aa-41e4-a5b5-f9df996c1a62 tempest-VolumesAdminNegativeTest-600178275 tempest-VolumesAdminNegativeTest-600178275-project-member] Lock "c8be0938-4b38-4e05-8afa-202d87a315b7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 197.292s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1189.620021] env[68571]: DEBUG nova.compute.manager [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1189.675396] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1189.675672] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1189.677154] env[68571]: INFO nova.compute.claims [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1190.086834] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29e0eac5-32ef-4bb2-8bc6-4148734f6e7e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1190.094847] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aeaccb9c-4f7e-46c4-8862-66213a80f0c9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1190.125144] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30ce6e67-9bf7-4197-b233-c53cb3eae196 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1190.132044] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a035fab-10ff-4e32-b35f-50b12750ebef {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1190.144787] env[68571]: DEBUG nova.compute.provider_tree [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1190.153732] env[68571]: DEBUG nova.scheduler.client.report [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1190.168897] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.493s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1190.169397] env[68571]: DEBUG nova.compute.manager [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 1190.215409] env[68571]: DEBUG nova.compute.utils [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1190.217034] env[68571]: DEBUG nova.compute.manager [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Not allocating networking since 'none' was specified. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}}
[ 1190.225937] env[68571]: DEBUG nova.compute.manager [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1190.301369] env[68571]: DEBUG nova.compute.manager [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Start spawning the instance on the hypervisor. {{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
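Annotation: the inventory payload reported above is what the resource tracker's claim is checked against; placement derives schedulable capacity per resource class as (total - reserved) * allocation_ratio. A small illustrative calculation using the values from this log:

```python
# Values copied from the inventory record above; the formula is standard
# placement capacity math, shown here only for illustration.
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
}

for rc, inv in inventory.items():
    effective = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(f"{rc}: {effective:g} schedulable")
    # VCPU: 192, MEMORY_MB: 196078, DISK_GB: 400
```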
[ 1190.329317] env[68571]: DEBUG nova.virt.hardware [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=<?>,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T21:24:40Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1190.329566] env[68571]: DEBUG nova.virt.hardware [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1190.329725] env[68571]: DEBUG nova.virt.hardware [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1190.329905] env[68571]: DEBUG nova.virt.hardware [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1190.330122] env[68571]: DEBUG nova.virt.hardware [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1190.330295] env[68571]: DEBUG nova.virt.hardware [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1190.330505] env[68571]: DEBUG nova.virt.hardware [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1190.330667] env[68571]: DEBUG nova.virt.hardware [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1190.330835] env[68571]: DEBUG nova.virt.hardware [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1190.330994] env[68571]: DEBUG nova.virt.hardware [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1190.331185] env[68571]: DEBUG nova.virt.hardware [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1190.332933] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-967289a8-1b43-4a05-af7f-14d22a26efd7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1190.340176] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a954ec89-a91b-49ac-9e05-e6b99bd3c95a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1190.353801] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Instance VIF info [] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1190.360583] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Creating folder: Project (83b58f008b5b4bec8f1aebf8690c6788). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1190.360886] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3fd91376-b1fc-4b24-9d6c-cd4d0fd807c9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1190.370852] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Created folder: Project (83b58f008b5b4bec8f1aebf8690c6788) in parent group-v692787.
[ 1190.371052] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Creating folder: Instances. Parent ref: group-v692856. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1190.371289] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3cd48597-7693-41c5-84cd-dd66b3401c1b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1190.380981] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Created folder: Instances in parent group-v692856.
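Annotation: the "Build topologies for 1 vcpu(s) 1:1:1" lines above come from enumerating sockets*cores*threads factorizations of the vCPU count within the 65536 limits. A simplified sketch of that search (Nova's real algorithm also applies preferences and ordering; this only reproduces the enumeration idea):

```python
def possible_topologies(vcpus, max_sockets=65536, max_cores=65536, max_threads=65536):
    """Yield (sockets, cores, threads) with s*c*t == vcpus, within limits."""
    for s in range(1, min(vcpus, max_sockets) + 1):
        if vcpus % s:
            continue
        for c in range(1, min(vcpus // s, max_cores) + 1):
            if (vcpus // s) % c:
                continue
            t = vcpus // (s * c)
            if t <= max_threads:
                yield (s, c, t)

print(list(possible_topologies(1)))   # [(1, 1, 1)] -- matches the log above
```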
[ 1190.381245] env[68571]: DEBUG oslo.service.loopingcall [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1190.381436] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1190.381645] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-6f05520c-fcc3-465c-b29e-cd29932fc4c5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1190.397830] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1190.397830] env[68571]: value = "task-3467681"
[ 1190.397830] env[68571]: _type = "Task"
[ 1190.397830] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1190.405054] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467681, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1190.911516] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467681, 'name': CreateVM_Task, 'duration_secs': 0.249349} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1190.911700] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1190.912137] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1190.912301] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1190.912692] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1190.912886] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c4cddeae-d6e8-475c-bdcb-16abed94ce79 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1190.918160] env[68571]: DEBUG oslo_vmware.api [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Waiting for the task: (returnval){
[ 1190.918160] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52876955-7fa6-a382-2e3f-54d54369c0c1"
[ 1190.918160] env[68571]: _type = "Task"
[ 1190.918160] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1190.931014] env[68571]: DEBUG oslo_vmware.api [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52876955-7fa6-a382-2e3f-54d54369c0c1, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1191.428502] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1191.428824] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1191.428949] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1199.315571] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Acquiring lock "47df3a07-1271-482c-bd3a-92fb9cef17bd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1199.315866] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Lock "47df3a07-1271-482c-bd3a-92fb9cef17bd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1199.416286] env[68571]: DEBUG oslo_concurrency.lockutils [None req-7bc7e02a-e021-4b38-8708-16a9402c35a7 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Acquiring lock "f5328efa-b3e0-48b2-8f13-9715e46cb017" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1211.334398] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquiring lock "73ba7761-3724-46ed-95c5-e93a6627a2d3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
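Annotation: the CreateVM_Task and SearchDatastore_Task waits above follow a poll-until-terminal pattern: the vCenter call returns a task reference immediately, and the client polls it until success or error. A generic sketch of that control flow (oslo.vmware's real implementation runs the poll on a green thread via a looping call; this just shows the shape):

```python
import time

def wait_for_task(poll, interval=0.5):
    """poll() is a hypothetical callable returning a dict like
    {'state': 'running'|'success'|'error', 'progress': int, 'error': str}."""
    while True:
        info = poll()
        if info['state'] == 'success':
            return info
        if info['state'] == 'error':
            # oslo.vmware translates the fault here (see translate_fault below)
            raise RuntimeError(info['error'])
        # corresponds to the repeated "progress is 0%." records in the log
        time.sleep(interval)
```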
lock "73ba7761-3724-46ed-95c5-e93a6627a2d3" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1211.334902] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Lock "73ba7761-3724-46ed-95c5-e93a6627a2d3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1215.739122] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f7c6b8ad-9819-4120-9a74-961394b05463 tempest-AttachInterfacesUnderV243Test-157106225 tempest-AttachInterfacesUnderV243Test-157106225-project-member] Acquiring lock "36548949-5053-4f4c-a0ca-ac5487a6cf14" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1215.742313] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f7c6b8ad-9819-4120-9a74-961394b05463 tempest-AttachInterfacesUnderV243Test-157106225 tempest-AttachInterfacesUnderV243Test-157106225-project-member] Lock "36548949-5053-4f4c-a0ca-ac5487a6cf14" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.003s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1236.587154] env[68571]: WARNING oslo_vmware.rw_handles [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1236.587154] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1236.587154] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1236.587154] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1236.587154] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1236.587154] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 1236.587154] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1236.587154] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1236.587154] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1236.587154] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1236.587154] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1236.587154] env[68571]: ERROR oslo_vmware.rw_handles [ 1236.587666] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Downloaded image file data 
6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/496be897-cdd8-446e-87ea-3f0f3ee99c21/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1236.590159] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1236.590399] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Copying Virtual Disk [datastore1] vmware_temp/496be897-cdd8-446e-87ea-3f0f3ee99c21/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/496be897-cdd8-446e-87ea-3f0f3ee99c21/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1236.590704] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-08417e7d-a4b7-4576-a4d6-b8cab572a5af {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1236.599053] env[68571]: DEBUG oslo_vmware.api [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Waiting for the task: (returnval){ [ 1236.599053] env[68571]: value = "task-3467682" [ 1236.599053] env[68571]: _type = "Task" [ 1236.599053] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1236.607578] env[68571]: DEBUG oslo_vmware.api [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Task: {'id': task-3467682, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1237.109060] env[68571]: DEBUG oslo_vmware.exceptions [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Fault InvalidArgument not matched. 
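Annotation: the records above show the image-cache flow: the sparse image is first streamed to a temp path on the datastore, then copied into the per-image cache location so later boots can reuse it. A rough sketch of that check-then-copy logic under a lock (the `ds` object and its methods are hypothetical stand-ins, not Nova's API; paths mirror the log):

```python
def cache_image(ds, image_id, tmp_path):
    """ds is a hypothetical datastore client with exists()/copy_virtual_disk()."""
    cached = f"devstack-image-cache_base/{image_id}/{image_id}.vmdk"
    # The SearchDatastore_Task records above correspond to this existence check;
    # the lock on the cache path (seen earlier) prevents concurrent cachers.
    if not ds.exists(cached):
        ds.copy_virtual_disk(tmp_path, cached)   # CopyVirtualDisk_Task in the log
    return cached
```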
[ 1237.109163] env[68571]: DEBUG oslo_concurrency.lockutils [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1237.109700] env[68571]: ERROR nova.compute.manager [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1237.109700] env[68571]: Faults: ['InvalidArgument']
[ 1237.109700] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Traceback (most recent call last):
[ 1237.109700] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]   File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 1237.109700] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]     yield resources
[ 1237.109700] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1237.109700] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]     self.driver.spawn(context, instance, image_meta,
[ 1237.109700] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1237.109700] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1237.109700] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1237.109700] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]     self._fetch_image_if_missing(context, vi)
[ 1237.109700] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1237.109949] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]     image_cache(vi, tmp_image_ds_loc)
[ 1237.109949] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1237.109949] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]     vm_util.copy_virtual_disk(
[ 1237.109949] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1237.109949] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]     session._wait_for_task(vmdk_copy_task)
[ 1237.109949] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1237.109949] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]     return self.wait_for_task(task_ref)
[ 1237.109949] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1237.109949] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]     return evt.wait()
[ 1237.109949] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1237.109949] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]     result = hub.switch()
[ 1237.109949] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1237.109949] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]     return self.greenlet.switch()
[ 1237.110222] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1237.110222] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]     self.f(*self.args, **self.kw)
[ 1237.110222] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1237.110222] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]     raise exceptions.translate_fault(task_info.error)
[ 1237.110222] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1237.110222] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Faults: ['InvalidArgument']
[ 1237.110222] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]
[ 1237.110222] env[68571]: INFO nova.compute.manager [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Terminating instance
[ 1237.111569] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1237.111772] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1237.112009] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b5bb0bf1-8f8f-47a5-b424-07fa92936216 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1237.112009] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b5bb0bf1-8f8f-47a5-b424-07fa92936216 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1237.114481] env[68571]: DEBUG nova.compute.manager [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1237.114666] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1237.115417] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0d4d49e-2ed7-4320-8951-c9bffd23c0da {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1237.122503] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1237.123525] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b0f28d54-a498-4a54-ab83-ba1b04313e52 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1237.124954] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1237.125139] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1237.125792] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-00fc023a-f23e-4b3f-ae00-e15aba357c10 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1237.131107] env[68571]: DEBUG oslo_vmware.api [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Waiting for the task: (returnval){
[ 1237.131107] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52d486ca-f10e-a9e5-555b-ed5385a1c98b"
[ 1237.131107] env[68571]: _type = "Task"
[ 1237.131107] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1237.143741] env[68571]: DEBUG oslo_vmware.api [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52d486ca-f10e-a9e5-555b-ed5385a1c98b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1237.198219] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1237.198439] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1237.198618] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Deleting the datastore file [datastore1] c962c9c7-04a4-46ec-a46f-fac13caa6a1e {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1237.198882] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-5a4d27bc-4e7e-4f19-89df-cbba63a49180 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1237.204713] env[68571]: DEBUG oslo_vmware.api [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Waiting for the task: (returnval){
[ 1237.204713] env[68571]: value = "task-3467684"
[ 1237.204713] env[68571]: _type = "Task"
[ 1237.204713] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1237.212063] env[68571]: DEBUG oslo_vmware.api [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Task: {'id': task-3467684, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
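The "Task: {...} progress is 0%." records come from oslo.vmware polling the vCenter task object until it leaves its queued/running states; wait_for_task parks the greenthread on an event while a looping call drives _poll_task. A simplified, synchronous stand-in for that loop (the real implementation lives in oslo_vmware/api.py; get_task_info is a hypothetical callable here):

    import time

    def wait_for_task(get_task_info, interval=0.5):
        # get_task_info() stands in for the property read oslo.vmware performs
        # on each poll; assume it returns a mapping with 'state' and friends.
        while True:
            info = get_task_info()
            if info['state'] == 'success':
                return info.get('result')
            if info['state'] == 'error':
                # oslo.vmware raises exceptions.translate_fault(task_info.error)
                # at this point, which is the VimFaultException in the traceback.
                raise RuntimeError(info['error'])
            time.sleep(interval)  # 'queued' or 'running': poll again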
[ 1237.489207] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1237.500660] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1237.500875] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1237.501085] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1237.501254] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}}
[ 1237.502359] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bed9b56-16e6-4355-b41d-15d6e0af9c18 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1237.510987] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d736e4b6-379f-43a7-bced-bf31b272c231 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1237.524654] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5f142b8b-d4e3-48ef-bfc9-3582ca6ce515 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1237.530828] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7e83e87a-8fa5-44db-bee2-7ebbaa587184 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1237.560384] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180906MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}}
[ 1237.560530] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1237.560717] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1237.641878] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance c962c9c7-04a4-46ec-a46f-fac13caa6a1e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1237.642218] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 25f17a16-f752-4927-a2a5-73f1f18e5c8c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1237.642218] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 244ba708-279e-440e-bc18-8c6ee7b83250 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1237.642316] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1237.642440] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance b60eb700-434f-4bea-a84f-9071402001c3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1237.642561] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 5e571ae2-9d45-402d-bce5-6e3721cc5374 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1237.642681] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1237.642799] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance b90ac11a-50c6-4d12-a545-ccd92243e6ca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1237.642915] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance afe033a3-4e04-4249-beed-169a3e40a721 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1237.643040] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f5328efa-b3e0-48b2-8f13-9715e46cb017 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1237.644239] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1237.644469] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Creating directory with path [datastore1] vmware_temp/487570b4-b2c8-49a9-bcd8-f751be9bb14e/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1237.644888] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c1146a5d-db0b-49e8-bb84-8b3578446be6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1237.654561] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance e025f82d-a6a8-4dd4-b891-872f4b2fa176 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1237.656943] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Created directory with path [datastore1] vmware_temp/487570b4-b2c8-49a9-bcd8-f751be9bb14e/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1237.657140] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Fetch image to [datastore1] vmware_temp/487570b4-b2c8-49a9-bcd8-f751be9bb14e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1237.657309] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/487570b4-b2c8-49a9-bcd8-f751be9bb14e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1237.658245] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f580d236-488d-4589-9180-69b4ff8cb298 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1237.665460] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1928deff-3923-4d7a-9a70-92dab03b0ff0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1237.668678] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance a1253c3f-921b-4417-a8fb-22168474f9c1 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1237.677049] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ccee7dbb-a644-41a0-bbe3-33ea78edcad3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1237.681662] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance b5c24d31-97f5-4b9b-a08e-4006a1d5d316 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1237.709545] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1237.715026] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0eefc84-c666-4613-889f-99691af51b25 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1237.720207] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 56c7e368-4032-4028-83f0-58b0cd3b3cbd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1237.725546] env[68571]: DEBUG oslo_vmware.api [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Task: {'id': task-3467684, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.061557} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1237.726209] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1237.726392] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1237.726560] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1237.726728] env[68571]: INFO nova.compute.manager [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Took 0.61 seconds to destroy the instance on the hypervisor. 
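Note how the claim abort for the failed instance (next records) has to queue behind the periodic resource audit: both code paths serialize on the "compute_resources" lock, which is why the later records show the abort waiting 0.562s while the audit holds the lock for 0.731s. A minimal sketch of the pattern behind the paired "Acquiring lock / acquired / released" lines, using oslo.concurrency directly (function names are illustrative; nova wraps the same lock in its own synchronized helper):

    from oslo_concurrency import lockutils

    @lockutils.synchronized('compute_resources')
    def update_available_resource():
        # periodic audit: holds the lock while it runs (0.731s in the log)
        ...

    @lockutils.synchronized('compute_resources')
    def abort_instance_claim(instance):
        # claim abort: blocks until the audit releases (0.562s wait in the log)
        ...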
[ 1237.728098] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2170ad37-dd80-4e64-80b4-be52aec77c3f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1237.730039] env[68571]: DEBUG nova.compute.claims [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1237.730214] env[68571]: DEBUG oslo_concurrency.lockutils [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1237.731299] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 67209cb0-7bb2-4aed-969a-e0d208fbf71b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1237.742096] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3cea970e-78f8-4b67-9350-65d3507f6b18 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1237.749894] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1237.753102] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance d62a50a6-fef2-42a8-a066-e36211c57f73 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1237.766089] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f0b9847b-9438-4be7-a081-db33dd3ff998 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1237.778614] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance b6a0771c-53cb-4503-bbc0-db992326b245 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1237.790503] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 6532563b-5e91-409f-be05-084196087a4d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1237.820960] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 10b3cea3-b9d1-45b7-9ac8-b922952371ba has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1237.832901] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 47df3a07-1271-482c-bd3a-92fb9cef17bd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1237.842735] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 73ba7761-3724-46ed-95c5-e93a6627a2d3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1237.852870] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 36548949-5053-4f4c-a0ca-ac5487a6cf14 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}}
[ 1237.853139] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}}
[ 1237.853290] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}}
[ 1237.963786] env[68571]: DEBUG oslo_vmware.rw_handles [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/487570b4-b2c8-49a9-bcd8-f751be9bb14e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1238.026496] env[68571]: DEBUG oslo_vmware.rw_handles [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
[ 1238.026759] env[68571]: DEBUG oslo_vmware.rw_handles [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/487570b4-b2c8-49a9-bcd8-f751be9bb14e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
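The "Final resource view" record is plain bookkeeping over the allocations enumerated above: ten tracked instances at 1 GB disk, 128 MB RAM and 1 vCPU each, plus the 512 MB of reserved memory reported in the placement inventory. A quick check of the arithmetic:

    # Reproduces the used_* figures from the 'Final resource view' record,
    # assuming the 512 MB reserved RAM shown in the inventory data below.
    instances = 10
    used_ram_mb = 512 + instances * 128   # -> 1792 (matches used_ram=1792MB)
    used_disk_gb = instances * 1          # -> 10   (matches used_disk=10GB)
    used_vcpus = instances * 1            # -> 10   (matches used_vcpus=10)
    assert (used_ram_mb, used_disk_gb, used_vcpus) == (1792, 10, 10)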
[ 1238.212237] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ee29e4a4-808d-4509-9ff6-58f505b5bde2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1238.220047] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20a30f9e-584b-4b7f-b7eb-5f3d8226452b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1238.248831] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e05b8717-6380-415a-9b06-c412f571d7d3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1238.256090] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-adff7d15-c90e-4aa9-937c-fce8ecb53310 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1238.270052] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1238.277895] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1238.291321] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}}
[ 1238.291513] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.731s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1238.291768] env[68571]: DEBUG oslo_concurrency.lockutils [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.562s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1238.604310] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da69a4be-be6f-4fc8-9a14-4a3e8f706f31 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1238.612505] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-84658287-2a02-4d2f-815d-5709ea51f763 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1238.642068] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-da79a4c1-35df-4c77-b9bc-d74256ed81c8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1238.649316] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-924468b5-3c16-4a4f-84d5-81d9d0ed142e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1238.662199] env[68571]: DEBUG nova.compute.provider_tree [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1238.670979] env[68571]: DEBUG nova.scheduler.client.report [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1238.686916] env[68571]: DEBUG oslo_concurrency.lockutils [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.395s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1238.687452] env[68571]: ERROR nova.compute.manager [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1238.687452] env[68571]: Faults: ['InvalidArgument']
[ 1238.687452] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Traceback (most recent call last):
[ 1238.687452] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1238.687452] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] self.driver.spawn(context, instance, image_meta,
[ 1238.687452] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1238.687452] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1238.687452] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1238.687452] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] self._fetch_image_if_missing(context, vi)
[ 1238.687452] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1238.687452] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] image_cache(vi, tmp_image_ds_loc)
[ 1238.687452] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1238.687752] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] vm_util.copy_virtual_disk(
[ 1238.687752] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1238.687752] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] session._wait_for_task(vmdk_copy_task)
[ 1238.687752] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1238.687752] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] return self.wait_for_task(task_ref)
[ 1238.687752] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1238.687752] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] return evt.wait()
[ 1238.687752] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1238.687752] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] result = hub.switch()
[ 1238.687752] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1238.687752] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] return self.greenlet.switch()
[ 1238.687752] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1238.687752] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] self.f(*self.args, **self.kw)
[ 1238.688154] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1238.688154] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] raise exceptions.translate_fault(task_info.error)
[ 1238.688154] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1238.688154] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Faults: ['InvalidArgument']
[ 1238.688154] env[68571]: ERROR nova.compute.manager [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e]
[ 1238.688154] env[68571]: DEBUG nova.compute.utils [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1238.689580] env[68571]: DEBUG nova.compute.manager [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Build of instance c962c9c7-04a4-46ec-a46f-fac13caa6a1e was re-scheduled: A specified parameter was not correct: fileType
[ 1238.689580] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1238.689950] env[68571]: DEBUG nova.compute.manager [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1238.690170] env[68571]: DEBUG nova.compute.manager [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1238.690364] env[68571]: DEBUG nova.compute.manager [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1238.690628] env[68571]: DEBUG nova.network.neutron [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1239.024166] env[68571]: DEBUG nova.network.neutron [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1239.035179] env[68571]: INFO nova.compute.manager [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Took 0.34 seconds to deallocate network for instance.
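With the fault recorded, the manager walks the standard cleanup-and-reschedule path seen above: notify about the failure, skip VIF unplugging (this driver does not implement it), deallocate networking, and hand the build back to the scheduler rather than failing it outright. Schematically, as a control-flow sketch rather than nova's actual code:

    def do_build_and_run_instance(instance, driver, network_api, reschedule):
        # Mirrors the log: spawn raises VimFaultException, networking is torn
        # down, and the request is re-scheduled (nova signals this internally
        # with a RescheduledException).
        try:
            driver.spawn(instance)
        except Exception as exc:
            network_api.deallocate_for_instance(instance)
            reschedule(instance, reason=str(exc))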
[ 1239.126074] env[68571]: INFO nova.scheduler.client.report [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Deleted allocations for instance c962c9c7-04a4-46ec-a46f-fac13caa6a1e [ 1239.149365] env[68571]: DEBUG oslo_concurrency.lockutils [None req-737b882e-dc8b-42b7-9622-8a62116d22cb tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Lock "c962c9c7-04a4-46ec-a46f-fac13caa6a1e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 664.439s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1239.151013] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8952282e-1c7a-48d9-bc6f-b1d515ec9a27 tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Lock "c962c9c7-04a4-46ec-a46f-fac13caa6a1e" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 467.123s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1239.151246] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8952282e-1c7a-48d9-bc6f-b1d515ec9a27 tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Acquiring lock "c962c9c7-04a4-46ec-a46f-fac13caa6a1e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1239.151569] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8952282e-1c7a-48d9-bc6f-b1d515ec9a27 tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Lock "c962c9c7-04a4-46ec-a46f-fac13caa6a1e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1239.151753] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8952282e-1c7a-48d9-bc6f-b1d515ec9a27 tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Lock "c962c9c7-04a4-46ec-a46f-fac13caa6a1e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1239.153821] env[68571]: INFO nova.compute.manager [None req-8952282e-1c7a-48d9-bc6f-b1d515ec9a27 tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Terminating instance [ 1239.156698] env[68571]: DEBUG nova.compute.manager [None req-8952282e-1c7a-48d9-bc6f-b1d515ec9a27 tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Start destroying the instance on the hypervisor. 
{{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1239.156895] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-8952282e-1c7a-48d9-bc6f-b1d515ec9a27 tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1239.157187] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-aa4f5c3a-0f3e-470e-bea2-92d344b23339 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1239.167039] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd309ed2-7876-45ec-912f-9ee7cf8349e0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1239.177962] env[68571]: DEBUG nova.compute.manager [None req-d4c87d72-4aff-4ed9-b707-dc0215a8c5b8 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: 2c21a8e5-da7f-4b3a-97ab-ec35f794edac] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1239.197690] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-8952282e-1c7a-48d9-bc6f-b1d515ec9a27 tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance c962c9c7-04a4-46ec-a46f-fac13caa6a1e could not be found. [ 1239.197898] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-8952282e-1c7a-48d9-bc6f-b1d515ec9a27 tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1239.198097] env[68571]: INFO nova.compute.manager [None req-8952282e-1c7a-48d9-bc6f-b1d515ec9a27 tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1239.198349] env[68571]: DEBUG oslo.service.loopingcall [None req-8952282e-1c7a-48d9-bc6f-b1d515ec9a27 tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1239.198570] env[68571]: DEBUG nova.compute.manager [-] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1239.198667] env[68571]: DEBUG nova.network.neutron [-] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1239.204411] env[68571]: DEBUG nova.compute.manager [None req-d4c87d72-4aff-4ed9-b707-dc0215a8c5b8 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: 2c21a8e5-da7f-4b3a-97ab-ec35f794edac] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 1239.223331] env[68571]: DEBUG nova.network.neutron [-] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1239.234283] env[68571]: INFO nova.compute.manager [-] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] Took 0.04 seconds to deallocate network for instance.
[ 1239.250485] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d4c87d72-4aff-4ed9-b707-dc0215a8c5b8 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Lock "2c21a8e5-da7f-4b3a-97ab-ec35f794edac" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 236.584s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1239.261138] env[68571]: DEBUG nova.compute.manager [None req-a1fd15ef-5912-4ad1-bb04-6276ffb5b1dc tempest-AttachInterfacesTestJSON-2026169319 tempest-AttachInterfacesTestJSON-2026169319-project-member] [instance: 5af733d9-dfa4-4059-8e33-1818695c8692] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1239.286520] env[68571]: DEBUG nova.compute.manager [None req-a1fd15ef-5912-4ad1-bb04-6276ffb5b1dc tempest-AttachInterfacesTestJSON-2026169319 tempest-AttachInterfacesTestJSON-2026169319-project-member] [instance: 5af733d9-dfa4-4059-8e33-1818695c8692] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 1239.307375] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a1fd15ef-5912-4ad1-bb04-6276ffb5b1dc tempest-AttachInterfacesTestJSON-2026169319 tempest-AttachInterfacesTestJSON-2026169319-project-member] Lock "5af733d9-dfa4-4059-8e33-1818695c8692" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 227.469s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1239.317378] env[68571]: DEBUG nova.compute.manager [None req-3782ec78-11e2-4f9a-9963-60dbdb163d8e tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: a333a6c9-5119-4d2f-81f3-cb86795ed364] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1239.348080] env[68571]: DEBUG nova.compute.manager [None req-3782ec78-11e2-4f9a-9963-60dbdb163d8e tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: a333a6c9-5119-4d2f-81f3-cb86795ed364] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 1239.354917] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8952282e-1c7a-48d9-bc6f-b1d515ec9a27 tempest-VolumesAssistedSnapshotsTest-1706620063 tempest-VolumesAssistedSnapshotsTest-1706620063-project-member] Lock "c962c9c7-04a4-46ec-a46f-fac13caa6a1e" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.204s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1239.355678] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "c962c9c7-04a4-46ec-a46f-fac13caa6a1e" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 124.852s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1239.355849] env[68571]: INFO nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: c962c9c7-04a4-46ec-a46f-fac13caa6a1e] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1239.356028] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "c962c9c7-04a4-46ec-a46f-fac13caa6a1e" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1239.368579] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3782ec78-11e2-4f9a-9963-60dbdb163d8e tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Lock "a333a6c9-5119-4d2f-81f3-cb86795ed364" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 224.849s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1239.376631] env[68571]: DEBUG nova.compute.manager [None req-5fda29f5-c038-4323-b6bc-258ade178d39 tempest-ServerGroupTestJSON-854930141 tempest-ServerGroupTestJSON-854930141-project-member] [instance: 1cbb0e1a-ca70-4e0e-9adb-c4b62e80818b] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1239.402032] env[68571]: DEBUG nova.compute.manager [None req-5fda29f5-c038-4323-b6bc-258ade178d39 tempest-ServerGroupTestJSON-854930141 tempest-ServerGroupTestJSON-854930141-project-member] [instance: 1cbb0e1a-ca70-4e0e-9adb-c4b62e80818b] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 1239.423310] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5fda29f5-c038-4323-b6bc-258ade178d39 tempest-ServerGroupTestJSON-854930141 tempest-ServerGroupTestJSON-854930141-project-member] Lock "1cbb0e1a-ca70-4e0e-9adb-c4b62e80818b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 216.399s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1239.431591] env[68571]: DEBUG nova.compute.manager [None req-e1c3dab6-4afd-4ba5-a7d4-c16cfe2f13e8 tempest-ServerRescueTestJSONUnderV235-1698324520 tempest-ServerRescueTestJSONUnderV235-1698324520-project-member] [instance: df5d4c12-01c8-46e2-b2a9-cf61a7d10e1a] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1239.453479] env[68571]: DEBUG nova.compute.manager [None req-e1c3dab6-4afd-4ba5-a7d4-c16cfe2f13e8 tempest-ServerRescueTestJSONUnderV235-1698324520 tempest-ServerRescueTestJSONUnderV235-1698324520-project-member] [instance: df5d4c12-01c8-46e2-b2a9-cf61a7d10e1a] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 1239.473037] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e1c3dab6-4afd-4ba5-a7d4-c16cfe2f13e8 tempest-ServerRescueTestJSONUnderV235-1698324520 tempest-ServerRescueTestJSONUnderV235-1698324520-project-member] Lock "df5d4c12-01c8-46e2-b2a9-cf61a7d10e1a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 212.692s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1239.481160] env[68571]: DEBUG nova.compute.manager [None req-410d438e-4fda-4576-9b01-83c7e267ae75 tempest-ServersAaction247Test-231076223 tempest-ServersAaction247Test-231076223-project-member] [instance: 466d2eae-c109-4286-a223-edca73d6c8fa] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1239.503065] env[68571]: DEBUG nova.compute.manager [None req-410d438e-4fda-4576-9b01-83c7e267ae75 tempest-ServersAaction247Test-231076223 tempest-ServersAaction247Test-231076223-project-member] [instance: 466d2eae-c109-4286-a223-edca73d6c8fa] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 1239.522453] env[68571]: DEBUG oslo_concurrency.lockutils [None req-410d438e-4fda-4576-9b01-83c7e267ae75 tempest-ServersAaction247Test-231076223 tempest-ServersAaction247Test-231076223-project-member] Lock "466d2eae-c109-4286-a223-edca73d6c8fa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 212.666s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1239.530646] env[68571]: DEBUG nova.compute.manager [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1239.575965] env[68571]: DEBUG oslo_concurrency.lockutils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1239.576258] env[68571]: DEBUG oslo_concurrency.lockutils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1239.577647] env[68571]: INFO nova.compute.claims [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1239.874608] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-92395602-3bac-4010-810e-56767f0324b5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1239.882827] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0b2774b-2583-4580-8671-4fc2ffbc5a6f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1239.912541] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-445737d5-f7f0-4b81-bca8-a5db2c36f15e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1239.919391] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-75460287-e59f-45eb-88ec-e9ea65a60217 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1239.932705] env[68571]: DEBUG nova.compute.provider_tree [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1239.941965] env[68571]: DEBUG nova.scheduler.client.report [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1239.956724] env[68571]: DEBUG oslo_concurrency.lockutils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.379s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1239.956724] env[68571]: DEBUG nova.compute.manager [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 1239.986490] env[68571]: DEBUG nova.compute.utils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1239.987988] env[68571]: DEBUG nova.compute.manager [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1239.988417] env[68571]: DEBUG nova.network.neutron [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
[ 1239.999217] env[68571]: DEBUG nova.compute.manager [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}}
[ 1240.058819] env[68571]: DEBUG nova.policy [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e24bf82c22bd4085aa58ab6c7b51213b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5c2d5852c6a245a18f6932b3e21b2eb3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}}
[ 1240.072020] env[68571]: DEBUG nova.compute.manager [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Start spawning the instance on the hypervisor. {{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}}
[ 1240.101083] env[68571]: DEBUG nova.virt.hardware [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}}
[ 1240.101323] env[68571]: DEBUG nova.virt.hardware [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}}
[ 1240.101473] env[68571]: DEBUG nova.virt.hardware [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}}
[ 1240.101646] env[68571]: DEBUG nova.virt.hardware [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}}
[ 1240.101785] env[68571]: DEBUG nova.virt.hardware [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}}
[ 1240.102663] env[68571]: DEBUG nova.virt.hardware [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}}
[ 1240.102663] env[68571]: DEBUG nova.virt.hardware [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1240.102663] env[68571]: DEBUG nova.virt.hardware [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1240.102663] env[68571]: DEBUG nova.virt.hardware [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1240.102663] env[68571]: DEBUG nova.virt.hardware [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1240.102814] env[68571]: DEBUG nova.virt.hardware [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
[ 1240.103782] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-535dd1c1-8daa-4f4d-aa01-2f3d427cb022 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1240.112755] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a63fc25-eac0-4b55-9ac2-589d2fdf3efd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1240.289611] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1240.289869] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1240.382594] env[68571]: DEBUG nova.network.neutron [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Successfully created port: 10de4455-e656-4819-a5fc-9fdc25d077d0 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1240.489577] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1240.489815] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1240.992816] env[68571]: DEBUG nova.network.neutron [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Successfully updated port: 10de4455-e656-4819-a5fc-9fdc25d077d0 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1241.009807] env[68571]: DEBUG oslo_concurrency.lockutils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Acquiring lock "refresh_cache-e025f82d-a6a8-4dd4-b891-872f4b2fa176" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1241.009967] env[68571]: DEBUG oslo_concurrency.lockutils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Acquired lock "refresh_cache-e025f82d-a6a8-4dd4-b891-872f4b2fa176" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1241.010136] env[68571]: DEBUG nova.network.neutron [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1241.045952] env[68571]: DEBUG nova.network.neutron [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}}
[ 1241.211828] env[68571]: DEBUG nova.network.neutron [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Updating instance_info_cache with network_info: [{"id": "10de4455-e656-4819-a5fc-9fdc25d077d0", "address": "fa:16:3e:e4:db:5c", "network": {"id": "a1072c72-0873-45f9-918f-6f5ffb5273c1", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-999383875-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5c2d5852c6a245a18f6932b3e21b2eb3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ace50835-5731-4c77-b6c0-3076d7b4aa21", "external-id": "nsx-vlan-transportzone-270", "segmentation_id": 270, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap10de4455-e6", "ovs_interfaceid": "10de4455-e656-4819-a5fc-9fdc25d077d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1241.227892] env[68571]: DEBUG oslo_concurrency.lockutils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Releasing lock "refresh_cache-e025f82d-a6a8-4dd4-b891-872f4b2fa176" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1241.228363] env[68571]: DEBUG nova.compute.manager [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Instance network_info: |[{"id": "10de4455-e656-4819-a5fc-9fdc25d077d0", "address": "fa:16:3e:e4:db:5c", "network": {"id": "a1072c72-0873-45f9-918f-6f5ffb5273c1", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-999383875-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5c2d5852c6a245a18f6932b3e21b2eb3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ace50835-5731-4c77-b6c0-3076d7b4aa21", "external-id": "nsx-vlan-transportzone-270", "segmentation_id": 270, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap10de4455-e6", "ovs_interfaceid": "10de4455-e656-4819-a5fc-9fdc25d077d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}}
[ 1241.229162] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:e4:db:5c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ace50835-5731-4c77-b6c0-3076d7b4aa21', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '10de4455-e656-4819-a5fc-9fdc25d077d0', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1241.238800] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Creating folder: Project (5c2d5852c6a245a18f6932b3e21b2eb3). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1241.239383] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b6cf9238-5708-4e54-9553-8551f84895ed {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1241.251799] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Created folder: Project (5c2d5852c6a245a18f6932b3e21b2eb3) in parent group-v692787.
[ 1241.251799] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Creating folder: Instances. Parent ref: group-v692859. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}}
[ 1241.252043] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-2e743f8a-ecc8-4d4e-a454-c1a438c1e13c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1241.261944] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Created folder: Instances in parent group-v692859.
[ 1241.262196] env[68571]: DEBUG oslo.service.loopingcall [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1241.262385] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1241.262588] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-99892914-e6a9-4f6a-83c8-bf7a9958417c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1241.283054] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1241.283054] env[68571]: value = "task-3467687"
[ 1241.283054] env[68571]: _type = "Task"
[ 1241.283054] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1241.288809] env[68571]: DEBUG nova.compute.manager [req-8de78a17-f507-466b-8511-e79e6888f1ab req-b362d4e4-8eeb-4881-a2ea-c53b3faaa8a2 service nova] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Received event network-vif-plugged-10de4455-e656-4819-a5fc-9fdc25d077d0 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1241.289031] env[68571]: DEBUG oslo_concurrency.lockutils [req-8de78a17-f507-466b-8511-e79e6888f1ab req-b362d4e4-8eeb-4881-a2ea-c53b3faaa8a2 service nova] Acquiring lock "e025f82d-a6a8-4dd4-b891-872f4b2fa176-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1241.290454] env[68571]: DEBUG oslo_concurrency.lockutils [req-8de78a17-f507-466b-8511-e79e6888f1ab req-b362d4e4-8eeb-4881-a2ea-c53b3faaa8a2 service nova] Lock "e025f82d-a6a8-4dd4-b891-872f4b2fa176-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1241.290652] env[68571]: DEBUG oslo_concurrency.lockutils [req-8de78a17-f507-466b-8511-e79e6888f1ab req-b362d4e4-8eeb-4881-a2ea-c53b3faaa8a2 service nova] Lock "e025f82d-a6a8-4dd4-b891-872f4b2fa176-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1241.290871] env[68571]: DEBUG nova.compute.manager [req-8de78a17-f507-466b-8511-e79e6888f1ab req-b362d4e4-8eeb-4881-a2ea-c53b3faaa8a2 service nova] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] No waiting events found dispatching network-vif-plugged-10de4455-e656-4819-a5fc-9fdc25d077d0 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1241.290999] env[68571]: WARNING nova.compute.manager [req-8de78a17-f507-466b-8511-e79e6888f1ab req-b362d4e4-8eeb-4881-a2ea-c53b3faaa8a2 service nova] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Received unexpected event network-vif-plugged-10de4455-e656-4819-a5fc-9fdc25d077d0 for instance with vm_state building and task_state spawning.
[ 1241.291203] env[68571]: DEBUG nova.compute.manager [req-8de78a17-f507-466b-8511-e79e6888f1ab req-b362d4e4-8eeb-4881-a2ea-c53b3faaa8a2 service nova] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Received event network-changed-10de4455-e656-4819-a5fc-9fdc25d077d0 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1241.291376] env[68571]: DEBUG nova.compute.manager [req-8de78a17-f507-466b-8511-e79e6888f1ab req-b362d4e4-8eeb-4881-a2ea-c53b3faaa8a2 service nova] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Refreshing instance network info cache due to event network-changed-10de4455-e656-4819-a5fc-9fdc25d077d0. {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1241.291566] env[68571]: DEBUG oslo_concurrency.lockutils [req-8de78a17-f507-466b-8511-e79e6888f1ab req-b362d4e4-8eeb-4881-a2ea-c53b3faaa8a2 service nova] Acquiring lock "refresh_cache-e025f82d-a6a8-4dd4-b891-872f4b2fa176" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1241.291700] env[68571]: DEBUG oslo_concurrency.lockutils [req-8de78a17-f507-466b-8511-e79e6888f1ab req-b362d4e4-8eeb-4881-a2ea-c53b3faaa8a2 service nova] Acquired lock "refresh_cache-e025f82d-a6a8-4dd4-b891-872f4b2fa176" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1241.291864] env[68571]: DEBUG nova.network.neutron [req-8de78a17-f507-466b-8511-e79e6888f1ab req-b362d4e4-8eeb-4881-a2ea-c53b3faaa8a2 service nova] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Refreshing network info cache for port 10de4455-e656-4819-a5fc-9fdc25d077d0 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1241.296350] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467687, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1241.570618] env[68571]: DEBUG nova.network.neutron [req-8de78a17-f507-466b-8511-e79e6888f1ab req-b362d4e4-8eeb-4881-a2ea-c53b3faaa8a2 service nova] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Updated VIF entry in instance network info cache for port 10de4455-e656-4819-a5fc-9fdc25d077d0. {{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1241.570995] env[68571]: DEBUG nova.network.neutron [req-8de78a17-f507-466b-8511-e79e6888f1ab req-b362d4e4-8eeb-4881-a2ea-c53b3faaa8a2 service nova] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Updating instance_info_cache with network_info: [{"id": "10de4455-e656-4819-a5fc-9fdc25d077d0", "address": "fa:16:3e:e4:db:5c", "network": {"id": "a1072c72-0873-45f9-918f-6f5ffb5273c1", "bridge": "br-int", "label": "tempest-ListServersNegativeTestJSON-999383875-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5c2d5852c6a245a18f6932b3e21b2eb3", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ace50835-5731-4c77-b6c0-3076d7b4aa21", "external-id": "nsx-vlan-transportzone-270", "segmentation_id": 270, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap10de4455-e6", "ovs_interfaceid": "10de4455-e656-4819-a5fc-9fdc25d077d0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1241.580730] env[68571]: DEBUG oslo_concurrency.lockutils [req-8de78a17-f507-466b-8511-e79e6888f1ab req-b362d4e4-8eeb-4881-a2ea-c53b3faaa8a2 service nova] Releasing lock "refresh_cache-e025f82d-a6a8-4dd4-b891-872f4b2fa176" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1241.792944] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467687, 'name': CreateVM_Task, 'duration_secs': 0.376357} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1241.793095] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1241.799904] env[68571]: DEBUG oslo_concurrency.lockutils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1241.800103] env[68571]: DEBUG oslo_concurrency.lockutils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1241.800414] env[68571]: DEBUG oslo_concurrency.lockutils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1241.800650] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f9de73b2-1e8a-42cd-b155-e1b3c0725636 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1241.804985] env[68571]: DEBUG oslo_vmware.api [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Waiting for the task: (returnval){
[ 1241.804985] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52151c7e-ec4b-1a5a-70a3-e9e9383da961"
[ 1241.804985] env[68571]: _type = "Task"
[ 1241.804985] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1241.814420] env[68571]: DEBUG oslo_vmware.api [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52151c7e-ec4b-1a5a-70a3-e9e9383da961, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1242.316599] env[68571]: DEBUG oslo_concurrency.lockutils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1242.316984] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1242.317069] env[68571]: DEBUG oslo_concurrency.lockutils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1242.490341] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1242.490572] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 1242.490756] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 1242.516281] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1242.516281] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1242.516281] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1242.516281] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1242.516281] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1242.517032] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1242.517032] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1242.517032] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1242.517032] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1242.517032] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1242.517305] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 1243.489204] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1243.489204] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1243.489204] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 1244.485481] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1245.489157] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1251.097897] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Acquiring lock "d890a035-a14e-4be0-97c8-87edd9bb88e4" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1251.098174] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Lock "d890a035-a14e-4be0-97c8-87edd9bb88e4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1251.321262] env[68571]: DEBUG oslo_concurrency.lockutils [None req-16366ddb-2298-4c09-8782-8d1a9c4ad86a tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Acquiring lock "4f0cfa21-d717-494c-8201-2c85dd11e512" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1251.321544] env[68571]: DEBUG oslo_concurrency.lockutils [None req-16366ddb-2298-4c09-8782-8d1a9c4ad86a tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Lock "4f0cfa21-d717-494c-8201-2c85dd11e512" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1258.274895] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d76d5332-f8d5-4301-b8ba-943d067f0ff8 tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Acquiring lock "e025f82d-a6a8-4dd4-b891-872f4b2fa176" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1284.266173] env[68571]: WARNING oslo_vmware.rw_handles [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1284.266173] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1284.266173] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1284.266173] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 1284.266173] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1284.266173] env[68571]: ERROR oslo_vmware.rw_handles response.begin()
[ 1284.266173] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1284.266173] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 1284.266173] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1284.266173] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 1284.266173] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1284.266173] env[68571]: ERROR oslo_vmware.rw_handles
[ 1284.266719] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/487570b4-b2c8-49a9-bcd8-f751be9bb14e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1284.269190] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1284.269457] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Copying Virtual Disk [datastore1] vmware_temp/487570b4-b2c8-49a9-bcd8-f751be9bb14e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/487570b4-b2c8-49a9-bcd8-f751be9bb14e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
[ 1284.269775] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-6e1f2dbf-fc6f-4f26-8c7e-b6db7e33310d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1284.278566] env[68571]: DEBUG oslo_vmware.api [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Waiting for the task: (returnval){
[ 1284.278566] env[68571]: value = "task-3467688"
[ 1284.278566] env[68571]: _type = "Task"
[ 1284.278566] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1284.287178] env[68571]: DEBUG oslo_vmware.api [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Task: {'id': task-3467688, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1284.789437] env[68571]: DEBUG oslo_vmware.exceptions [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Fault InvalidArgument not matched. {{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 1284.789749] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1284.790337] env[68571]: ERROR nova.compute.manager [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1284.790337] env[68571]: Faults: ['InvalidArgument']
[ 1284.790337] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Traceback (most recent call last):
[ 1284.790337] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 1284.790337] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] yield resources
[ 1284.790337] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1284.790337] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] self.driver.spawn(context, instance, image_meta,
[ 1284.790337] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1284.790337] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1284.790337] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1284.790337] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] self._fetch_image_if_missing(context, vi)
[ 1284.790337] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1284.790731] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] image_cache(vi, tmp_image_ds_loc)
[ 1284.790731] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1284.790731] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] vm_util.copy_virtual_disk(
[ 1284.790731] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1284.790731] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] session._wait_for_task(vmdk_copy_task)
[ 1284.790731] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1284.790731] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] return self.wait_for_task(task_ref)
[ 1284.790731] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1284.790731] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] return evt.wait()
[ 1284.790731] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1284.790731] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] result = hub.switch()
[ 1284.790731] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1284.790731] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] return self.greenlet.switch()
[ 1284.791118] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1284.791118] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] self.f(*self.args, **self.kw)
[ 1284.791118] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1284.791118] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] raise exceptions.translate_fault(task_info.error)
[ 1284.791118] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1284.791118] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Faults: ['InvalidArgument']
[ 1284.791118] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c]
[ 1284.791118] env[68571]: INFO nova.compute.manager [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Terminating instance
[ 1284.792269] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1284.792487] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1284.792735] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-41464d83-563f-4adf-8166-39c616dc07bc {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1284.794996] env[68571]: DEBUG nova.compute.manager [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1284.795208] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1284.795964] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4872abb9-4b69-4b81-9646-ee2d77f6d9c4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1284.802973] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1284.803218] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-2e21649a-bb1c-447c-8abd-fddc7fdaafa9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1284.805402] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1284.805590] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1284.806526] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-80ef22b6-89cd-4cf6-bb3c-1145966f139c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1284.811028] env[68571]: DEBUG oslo_vmware.api [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Waiting for the task: (returnval){
[ 1284.811028] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]521b0952-1616-5cdb-fe54-0027cdb2c517"
[ 1284.811028] env[68571]: _type = "Task"
[ 1284.811028] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1284.819083] env[68571]: DEBUG oslo_vmware.api [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]521b0952-1616-5cdb-fe54-0027cdb2c517, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1284.869472] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1284.869636] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1284.869815] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Deleting the datastore file [datastore1] 25f17a16-f752-4927-a2a5-73f1f18e5c8c {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1284.870090] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-f75a08a3-3240-4c81-8fa3-a0b553c6e4e0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1284.876356] env[68571]: DEBUG oslo_vmware.api [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Waiting for the task: (returnval){
[ 1284.876356] env[68571]: value = "task-3467690"
[ 1284.876356] env[68571]: _type = "Task"
[ 1284.876356] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1284.883449] env[68571]: DEBUG oslo_vmware.api [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Task: {'id': task-3467690, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1285.322026] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1285.322314] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Creating directory with path [datastore1] vmware_temp/8a23b1f6-8888-46ce-8eb2-50f925528245/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1285.322394] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bf0d3bbf-3e55-4fdf-b9e7-da593b0c47d1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1285.334635] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Created directory with path [datastore1] vmware_temp/8a23b1f6-8888-46ce-8eb2-50f925528245/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1285.334763] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Fetch image to [datastore1] vmware_temp/8a23b1f6-8888-46ce-8eb2-50f925528245/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1285.335074] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/8a23b1f6-8888-46ce-8eb2-50f925528245/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1285.335678] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22138d49-5d34-44a5-b8e5-b8b153b4ef60 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1285.342192] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c34f114d-640c-4f22-96f3-84ff40e0e23e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1285.355018] env[68571]:
DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1824fc09-476e-47a9-8ff1-335d639cecf0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1285.386350] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d80008d3-96ec-4349-810c-150958a9e6d5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1285.393393] env[68571]: DEBUG oslo_vmware.api [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Task: {'id': task-3467690, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072637} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1285.395152] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1285.395352] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1285.395553] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1285.395731] env[68571]: INFO nova.compute.manager [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Took 0.60 seconds to destroy the instance on the hypervisor. 
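The records above ("Waiting for the task", "progress is 0%", then "completed successfully" with a duration_secs) are the standard oslo.vmware task-polling loop, and the traceback earlier in this log shows its failure path: _poll_task translates a failed vCenter task into a VimFaultException whose fault list named 'InvalidArgument' for the fileType parameter. A minimal caller-side sketch of that pattern, assuming an already-established oslo_vmware.api.VMwareAPISession in `session` and a vCenter task reference in `task_ref`; the helper name is illustrative and not code from this log:

    # Illustrative only: `session` is assumed to be an established
    # oslo_vmware.api.VMwareAPISession, `task_ref` a vCenter task moref.
    from oslo_vmware import exceptions as vexc

    def wait_and_report(session, task_ref):
        try:
            # Polls the task server-side (the "progress is N%" records)
            # and returns its result once vCenter reports success.
            return session.wait_for_task(task_ref)
        except vexc.VimFaultException as exc:
            # translate_fault() attaches the vCenter fault names; this is
            # what the log prints as "Faults: ['InvalidArgument']".
            print('task failed with faults: %s' % exc.fault_list)
            raise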
[ 1285.397509] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-18f04a6a-7c6a-4dbe-b729-f6f1cda4d4fe {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1285.399511] env[68571]: DEBUG nova.compute.claims [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1285.399657] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1285.399878] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1285.426015] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1285.550985] env[68571]: DEBUG oslo_vmware.rw_handles [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8a23b1f6-8888-46ce-8eb2-50f925528245/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1285.610107] env[68571]: DEBUG oslo_vmware.rw_handles [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1285.610303] env[68571]: DEBUG oslo_vmware.rw_handles [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/8a23b1f6-8888-46ce-8eb2-50f925528245/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1285.787105] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1264534-24e4-4735-b3b6-bad44b87128a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1285.794354] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-059bd039-4c1c-42e9-a64a-f09228c234a0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1285.825180] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-364c3525-245c-498f-a304-829a69a9b874 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1285.832449] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d36f7afc-0190-4b66-9f50-0a452b3613cd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1285.845273] env[68571]: DEBUG nova.compute.provider_tree [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1285.853897] env[68571]: DEBUG nova.scheduler.client.report [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1285.867286] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.467s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1285.867828] env[68571]: ERROR nova.compute.manager [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1285.867828] env[68571]: Faults: ['InvalidArgument'] [ 1285.867828] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Traceback (most recent call last): [ 1285.867828] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] File 
"/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1285.867828] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] self.driver.spawn(context, instance, image_meta, [ 1285.867828] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1285.867828] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1285.867828] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1285.867828] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] self._fetch_image_if_missing(context, vi) [ 1285.867828] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1285.867828] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] image_cache(vi, tmp_image_ds_loc) [ 1285.867828] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1285.868414] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] vm_util.copy_virtual_disk( [ 1285.868414] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1285.868414] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] session._wait_for_task(vmdk_copy_task) [ 1285.868414] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1285.868414] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] return self.wait_for_task(task_ref) [ 1285.868414] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1285.868414] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] return evt.wait() [ 1285.868414] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1285.868414] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] result = hub.switch() [ 1285.868414] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1285.868414] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] return self.greenlet.switch() [ 1285.868414] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1285.868414] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] self.f(*self.args, **self.kw) [ 1285.868817] 
env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1285.868817] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] raise exceptions.translate_fault(task_info.error) [ 1285.868817] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1285.868817] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Faults: ['InvalidArgument'] [ 1285.868817] env[68571]: ERROR nova.compute.manager [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] [ 1285.868817] env[68571]: DEBUG nova.compute.utils [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1285.869968] env[68571]: DEBUG nova.compute.manager [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Build of instance 25f17a16-f752-4927-a2a5-73f1f18e5c8c was re-scheduled: A specified parameter was not correct: fileType [ 1285.869968] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1285.870354] env[68571]: DEBUG nova.compute.manager [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1285.870527] env[68571]: DEBUG nova.compute.manager [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1285.870700] env[68571]: DEBUG nova.compute.manager [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1285.870861] env[68571]: DEBUG nova.network.neutron [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1286.180124] env[68571]: DEBUG nova.network.neutron [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1286.189966] env[68571]: INFO nova.compute.manager [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Took 0.32 seconds to deallocate network for instance. [ 1286.298270] env[68571]: INFO nova.scheduler.client.report [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Deleted allocations for instance 25f17a16-f752-4927-a2a5-73f1f18e5c8c [ 1286.323512] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e4c4f3b6-5aff-4ea0-b560-a0e1b18a82b8 tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Lock "25f17a16-f752-4927-a2a5-73f1f18e5c8c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 681.708s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1286.324591] env[68571]: DEBUG oslo_concurrency.lockutils [None req-64f40713-e892-486c-9ab0-8cf069fb217b tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Lock "25f17a16-f752-4927-a2a5-73f1f18e5c8c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 483.235s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1286.325128] env[68571]: DEBUG oslo_concurrency.lockutils [None req-64f40713-e892-486c-9ab0-8cf069fb217b tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Acquiring lock "25f17a16-f752-4927-a2a5-73f1f18e5c8c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1286.325128] env[68571]: DEBUG oslo_concurrency.lockutils [None req-64f40713-e892-486c-9ab0-8cf069fb217b tempest-InstanceActionsNegativeTestJSON-1280557668
tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Lock "25f17a16-f752-4927-a2a5-73f1f18e5c8c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1286.325269] env[68571]: DEBUG oslo_concurrency.lockutils [None req-64f40713-e892-486c-9ab0-8cf069fb217b tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Lock "25f17a16-f752-4927-a2a5-73f1f18e5c8c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1286.327997] env[68571]: INFO nova.compute.manager [None req-64f40713-e892-486c-9ab0-8cf069fb217b tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Terminating instance [ 1286.329590] env[68571]: DEBUG oslo_concurrency.lockutils [None req-64f40713-e892-486c-9ab0-8cf069fb217b tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Acquiring lock "refresh_cache-25f17a16-f752-4927-a2a5-73f1f18e5c8c" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1286.329740] env[68571]: DEBUG oslo_concurrency.lockutils [None req-64f40713-e892-486c-9ab0-8cf069fb217b tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Acquired lock "refresh_cache-25f17a16-f752-4927-a2a5-73f1f18e5c8c" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1286.329903] env[68571]: DEBUG nova.network.neutron [None req-64f40713-e892-486c-9ab0-8cf069fb217b tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1286.339171] env[68571]: DEBUG nova.compute.manager [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: a1253c3f-921b-4417-a8fb-22168474f9c1] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1286.360385] env[68571]: DEBUG nova.network.neutron [None req-64f40713-e892-486c-9ab0-8cf069fb217b tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1286.365108] env[68571]: DEBUG nova.compute.manager [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: a1253c3f-921b-4417-a8fb-22168474f9c1] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
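Worth noting in the lockutils records above: the per-instance build lock on 25f17a16-f752-4927-a2a5-73f1f18e5c8c was held for 681.708s before the failed build released it, and the terminate request queued behind it for 483.235s. When auditing a log like this, the waited/held figures can be pulled out mechanically; a small, purely illustrative stdlib sketch, keyed to the exact "acquired by" / "released by" phrasing of these records:

    import re

    # Matches oslo.concurrency lockutils records such as:
    #   Lock "compute_resources" acquired by "..." :: waited 0.000s
    #   Lock "compute_resources" "released" by "..." :: held 0.412s
    LOCK_RE = re.compile(
        r'Lock "(?P<name>[^"]+)" "?(?P<event>acquired|released)"? by '
        r'"(?P<owner>[^"]+)" :: (?P<kind>waited|held) (?P<secs>[\d.]+)s')

    def lock_stats(lines):
        """Yield (lock name, owner, waited|held, seconds) per record."""
        for line in lines:
            # finditer, because each flattened line here holds many records.
            for m in LOCK_RE.finditer(line):
                yield (m.group('name'), m.group('owner'),
                       m.group('kind'), float(m.group('secs')))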
[ 1286.388580] env[68571]: DEBUG oslo_concurrency.lockutils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Lock "a1253c3f-921b-4417-a8fb-22168474f9c1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 224.946s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1286.404124] env[68571]: DEBUG nova.compute.manager [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: b5c24d31-97f5-4b9b-a08e-4006a1d5d316] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1286.429196] env[68571]: DEBUG nova.compute.manager [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: b5c24d31-97f5-4b9b-a08e-4006a1d5d316] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1286.454143] env[68571]: DEBUG oslo_concurrency.lockutils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Lock "b5c24d31-97f5-4b9b-a08e-4006a1d5d316" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 224.987s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1286.464213] env[68571]: DEBUG nova.compute.manager [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Starting instance...
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1286.511282] env[68571]: DEBUG nova.network.neutron [None req-64f40713-e892-486c-9ab0-8cf069fb217b tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1286.518851] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1286.518851] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1286.520480] env[68571]: INFO nova.compute.claims [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1286.523396] env[68571]: DEBUG oslo_concurrency.lockutils [None req-64f40713-e892-486c-9ab0-8cf069fb217b tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Releasing lock "refresh_cache-25f17a16-f752-4927-a2a5-73f1f18e5c8c" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1286.523774] env[68571]: DEBUG nova.compute.manager [None req-64f40713-e892-486c-9ab0-8cf069fb217b tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Start destroying the instance on the hypervisor. 
{{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1286.523956] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-64f40713-e892-486c-9ab0-8cf069fb217b tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1286.524966] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d1ded682-b1cd-4af5-83f1-75b7912c21fb {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1286.534756] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3e23819-a5ab-4015-a10f-1db5947ad4ed {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1286.564674] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-64f40713-e892-486c-9ab0-8cf069fb217b tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 25f17a16-f752-4927-a2a5-73f1f18e5c8c could not be found. [ 1286.564990] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-64f40713-e892-486c-9ab0-8cf069fb217b tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1286.565255] env[68571]: INFO nova.compute.manager [None req-64f40713-e892-486c-9ab0-8cf069fb217b tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1286.565572] env[68571]: DEBUG oslo.service.loopingcall [None req-64f40713-e892-486c-9ab0-8cf069fb217b tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1286.568734] env[68571]: DEBUG nova.compute.manager [-] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1286.568734] env[68571]: DEBUG nova.network.neutron [-] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1286.587073] env[68571]: DEBUG nova.network.neutron [-] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Instance cache missing network info.
{{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1286.594119] env[68571]: DEBUG nova.network.neutron [-] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1286.604507] env[68571]: INFO nova.compute.manager [-] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] Took 0.04 seconds to deallocate network for instance. [ 1286.693469] env[68571]: DEBUG oslo_concurrency.lockutils [None req-64f40713-e892-486c-9ab0-8cf069fb217b tempest-InstanceActionsNegativeTestJSON-1280557668 tempest-InstanceActionsNegativeTestJSON-1280557668-project-member] Lock "25f17a16-f752-4927-a2a5-73f1f18e5c8c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.369s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1286.695096] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "25f17a16-f752-4927-a2a5-73f1f18e5c8c" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 172.191s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1286.695393] env[68571]: INFO nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 25f17a16-f752-4927-a2a5-73f1f18e5c8c] During sync_power_state the instance has a pending task (deleting). Skip. [ 1286.695650] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "25f17a16-f752-4927-a2a5-73f1f18e5c8c" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1286.851419] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0179b18-7ac6-4952-a480-f4861c041061 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1286.858843] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2f223eb-a0e9-463f-8751-2bf92ec69237 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1286.887381] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bf27d8a5-b612-48d1-82c8-8c5e96d3e47e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1286.893929] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7def481-c934-4347-80c1-98e249cbfc9b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1286.906338] env[68571]: DEBUG nova.compute.provider_tree [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1286.915478] env[68571]: DEBUG nova.scheduler.client.report [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
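The inventory record just above (identical to the one logged at [ 1285.853897]) is this compute node's placement view; the usable capacity placement derives from it is (total - reserved) * allocation_ratio per resource class, which for these figures comes to 192 schedulable VCPUs, 196078 MB of RAM and 400 GB of disk. A quick check, with the dict trimmed to the fields the formula needs:

    # Inventory figures copied from the scheduler.client.report record above.
    inventory = {
        'VCPU': {'total': 48, 'reserved': 0, 'allocation_ratio': 4.0},
        'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
        'DISK_GB': {'total': 400, 'reserved': 0, 'allocation_ratio': 1.0},
    }

    for rc, inv in inventory.items():
        # Placement's effective-capacity formula per resource class.
        capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
        print(rc, capacity)  # VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0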
[ 1286.930469] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.412s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1286.930946] env[68571]: DEBUG nova.compute.manager [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1286.962631] env[68571]: DEBUG nova.compute.utils [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1286.964030] env[68571]: DEBUG nova.compute.manager [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1286.964208] env[68571]: DEBUG nova.network.neutron [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1286.972058] env[68571]: DEBUG nova.compute.manager [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Start building block device mappings for instance.
{{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1287.018551] env[68571]: DEBUG nova.policy [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '75464428d107469f99f4308cfdb6b2df', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '506bd7cf3d9c4c54aabe7ef0be376fe9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 1287.035102] env[68571]: DEBUG nova.compute.manager [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Start spawning the instance on the hypervisor. {{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1287.062665] env[68571]: DEBUG nova.virt.hardware [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1287.062921] env[68571]: DEBUG nova.virt.hardware [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1287.063089] env[68571]: DEBUG nova.virt.hardware [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1287.063272] env[68571]: DEBUG nova.virt.hardware [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1287.063414] env[68571]: DEBUG nova.virt.hardware [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1287.063556] env[68571]: DEBUG 
nova.virt.hardware [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1287.063760] env[68571]: DEBUG nova.virt.hardware [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1287.063915] env[68571]: DEBUG nova.virt.hardware [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1287.064094] env[68571]: DEBUG nova.virt.hardware [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1287.064261] env[68571]: DEBUG nova.virt.hardware [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1287.064447] env[68571]: DEBUG nova.virt.hardware [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1287.065310] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ac7a8a97-9c28-427a-bc4d-eac9149e3a39 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1287.072984] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58b21592-0cae-4d92-8607-56fdd393773b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1287.357783] env[68571]: DEBUG nova.network.neutron [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Successfully created port: 73f687e6-c288-45a6-b340-4addce1093a4 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1287.972896] env[68571]: DEBUG nova.network.neutron [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Successfully updated port: 73f687e6-c288-45a6-b340-4addce1093a4 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1287.990639] env[68571]: DEBUG oslo_concurrency.lockutils [None 
req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquiring lock "refresh_cache-a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1287.990736] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquired lock "refresh_cache-a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1287.990858] env[68571]: DEBUG nova.network.neutron [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1288.028426] env[68571]: DEBUG nova.network.neutron [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1288.355631] env[68571]: DEBUG nova.compute.manager [req-c8d2de51-c4ff-4f46-b8cc-f2f9886227eb req-8984a5a3-0d4c-4cce-a54b-77e7013fa3cd service nova] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Received event network-vif-plugged-73f687e6-c288-45a6-b340-4addce1093a4 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1288.355860] env[68571]: DEBUG oslo_concurrency.lockutils [req-c8d2de51-c4ff-4f46-b8cc-f2f9886227eb req-8984a5a3-0d4c-4cce-a54b-77e7013fa3cd service nova] Acquiring lock "a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1288.356166] env[68571]: DEBUG oslo_concurrency.lockutils [req-c8d2de51-c4ff-4f46-b8cc-f2f9886227eb req-8984a5a3-0d4c-4cce-a54b-77e7013fa3cd service nova] Lock "a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1288.356246] env[68571]: DEBUG oslo_concurrency.lockutils [req-c8d2de51-c4ff-4f46-b8cc-f2f9886227eb req-8984a5a3-0d4c-4cce-a54b-77e7013fa3cd service nova] Lock "a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1288.356412] env[68571]: DEBUG nova.compute.manager [req-c8d2de51-c4ff-4f46-b8cc-f2f9886227eb req-8984a5a3-0d4c-4cce-a54b-77e7013fa3cd service nova] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] No waiting events found dispatching network-vif-plugged-73f687e6-c288-45a6-b340-4addce1093a4 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1288.356577] env[68571]: WARNING nova.compute.manager [req-c8d2de51-c4ff-4f46-b8cc-f2f9886227eb req-8984a5a3-0d4c-4cce-a54b-77e7013fa3cd service nova]
[instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Received unexpected event network-vif-plugged-73f687e6-c288-45a6-b340-4addce1093a4 for instance with vm_state building and task_state spawning. [ 1288.356769] env[68571]: DEBUG nova.compute.manager [req-c8d2de51-c4ff-4f46-b8cc-f2f9886227eb req-8984a5a3-0d4c-4cce-a54b-77e7013fa3cd service nova] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Received event network-changed-73f687e6-c288-45a6-b340-4addce1093a4 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1288.356938] env[68571]: DEBUG nova.compute.manager [req-c8d2de51-c4ff-4f46-b8cc-f2f9886227eb req-8984a5a3-0d4c-4cce-a54b-77e7013fa3cd service nova] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Refreshing instance network info cache due to event network-changed-73f687e6-c288-45a6-b340-4addce1093a4. {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1288.357244] env[68571]: DEBUG oslo_concurrency.lockutils [req-c8d2de51-c4ff-4f46-b8cc-f2f9886227eb req-8984a5a3-0d4c-4cce-a54b-77e7013fa3cd service nova] Acquiring lock "refresh_cache-a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1288.398854] env[68571]: DEBUG nova.network.neutron [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Updating instance_info_cache with network_info: [{"id": "73f687e6-c288-45a6-b340-4addce1093a4", "address": "fa:16:3e:0d:86:d0", "network": {"id": "20ed8763-0c02-410b-9f5d-cb667bfdaa58", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1635789088-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "506bd7cf3d9c4c54aabe7ef0be376fe9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "50886eea-591a-452c-a27b-5f22cfc9df85", "external-id": "nsx-vlan-transportzone-578", "segmentation_id": 578, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap73f687e6-c2", "ovs_interfaceid": "73f687e6-c288-45a6-b340-4addce1093a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1288.411448] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Releasing lock "refresh_cache-a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1288.411769] env[68571]: DEBUG nova.compute.manager [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Instance network_info: |[{"id": "73f687e6-c288-45a6-b340-4addce1093a4", 
"address": "fa:16:3e:0d:86:d0", "network": {"id": "20ed8763-0c02-410b-9f5d-cb667bfdaa58", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1635789088-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "506bd7cf3d9c4c54aabe7ef0be376fe9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "50886eea-591a-452c-a27b-5f22cfc9df85", "external-id": "nsx-vlan-transportzone-578", "segmentation_id": 578, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap73f687e6-c2", "ovs_interfaceid": "73f687e6-c288-45a6-b340-4addce1093a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1288.412090] env[68571]: DEBUG oslo_concurrency.lockutils [req-c8d2de51-c4ff-4f46-b8cc-f2f9886227eb req-8984a5a3-0d4c-4cce-a54b-77e7013fa3cd service nova] Acquired lock "refresh_cache-a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1288.412274] env[68571]: DEBUG nova.network.neutron [req-c8d2de51-c4ff-4f46-b8cc-f2f9886227eb req-8984a5a3-0d4c-4cce-a54b-77e7013fa3cd service nova] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Refreshing network info cache for port 73f687e6-c288-45a6-b340-4addce1093a4 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1288.413415] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0d:86:d0', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '50886eea-591a-452c-a27b-5f22cfc9df85', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '73f687e6-c288-45a6-b340-4addce1093a4', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1288.422264] env[68571]: DEBUG oslo.service.loopingcall [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1288.423635] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1288.426054] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-12aad1a8-5510-4495-8183-1a1d4e41b7ff {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1288.449374] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1288.449374] env[68571]: value = "task-3467691" [ 1288.449374] env[68571]: _type = "Task" [ 1288.449374] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1288.457432] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467691, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1288.759938] env[68571]: DEBUG nova.network.neutron [req-c8d2de51-c4ff-4f46-b8cc-f2f9886227eb req-8984a5a3-0d4c-4cce-a54b-77e7013fa3cd service nova] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Updated VIF entry in instance network info cache for port 73f687e6-c288-45a6-b340-4addce1093a4. {{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1288.760331] env[68571]: DEBUG nova.network.neutron [req-c8d2de51-c4ff-4f46-b8cc-f2f9886227eb req-8984a5a3-0d4c-4cce-a54b-77e7013fa3cd service nova] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Updating instance_info_cache with network_info: [{"id": "73f687e6-c288-45a6-b340-4addce1093a4", "address": "fa:16:3e:0d:86:d0", "network": {"id": "20ed8763-0c02-410b-9f5d-cb667bfdaa58", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1635789088-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "506bd7cf3d9c4c54aabe7ef0be376fe9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "50886eea-591a-452c-a27b-5f22cfc9df85", "external-id": "nsx-vlan-transportzone-578", "segmentation_id": 578, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap73f687e6-c2", "ovs_interfaceid": "73f687e6-c288-45a6-b340-4addce1093a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1288.770797] env[68571]: DEBUG oslo_concurrency.lockutils [req-c8d2de51-c4ff-4f46-b8cc-f2f9886227eb req-8984a5a3-0d4c-4cce-a54b-77e7013fa3cd service nova] Releasing lock "refresh_cache-a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1288.961042] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467691, 'name': CreateVM_Task, 'duration_secs': 0.29556} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1288.961042] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1288.961042] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1288.961287] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1288.961415] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1288.961669] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f7d138c8-bad5-4193-8c9c-49c9b524c803 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1288.966086] env[68571]: DEBUG oslo_vmware.api [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Waiting for the task: (returnval){ [ 1288.966086] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]5281aaeb-2cf6-824c-052e-1a8ab7aa71d7" [ 1288.966086] env[68571]: _type = "Task" [ 1288.966086] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1288.973195] env[68571]: DEBUG oslo_vmware.api [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]5281aaeb-2cf6-824c-052e-1a8ab7aa71d7, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1289.476795] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1289.477418] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1289.477418] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1297.489737] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1297.501920] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1297.502158] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1297.502330] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1297.502485] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1297.505433] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-639454f9-8734-41c1-a446-379b67d78e63 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1297.513689] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a084f32-cc71-4d3e-a45b-9d5429c59473 {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1297.527138] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-071420b3-5021-4f6e-9108-c7f3a08897b2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1297.533075] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-926bccde-6f6a-434f-a281-0e76528d9bd4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1297.560867] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180909MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1297.561014] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1297.561209] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1297.640400] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 244ba708-279e-440e-bc18-8c6ee7b83250 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1297.640552] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1297.640684] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance b60eb700-434f-4bea-a84f-9071402001c3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1297.640810] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 5e571ae2-9d45-402d-bce5-6e3721cc5374 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1297.640928] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1297.641060] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance b90ac11a-50c6-4d12-a545-ccd92243e6ca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1297.641182] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance afe033a3-4e04-4249-beed-169a3e40a721 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1297.641294] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f5328efa-b3e0-48b2-8f13-9715e46cb017 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1297.641405] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance e025f82d-a6a8-4dd4-b891-872f4b2fa176 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1297.641516] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1297.652560] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 56c7e368-4032-4028-83f0-58b0cd3b3cbd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1297.662757] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 67209cb0-7bb2-4aed-969a-e0d208fbf71b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1297.672701] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3cea970e-78f8-4b67-9350-65d3507f6b18 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1297.683335] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance d62a50a6-fef2-42a8-a066-e36211c57f73 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1297.693177] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f0b9847b-9438-4be7-a081-db33dd3ff998 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1297.702538] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance b6a0771c-53cb-4503-bbc0-db992326b245 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1297.711937] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 6532563b-5e91-409f-be05-084196087a4d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1297.721237] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 10b3cea3-b9d1-45b7-9ac8-b922952371ba has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1297.730613] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 47df3a07-1271-482c-bd3a-92fb9cef17bd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1297.739748] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 73ba7761-3724-46ed-95c5-e93a6627a2d3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1297.749423] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 36548949-5053-4f4c-a0ca-ac5487a6cf14 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1297.759667] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance d890a035-a14e-4be0-97c8-87edd9bb88e4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1297.768470] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4f0cfa21-d717-494c-8201-2c85dd11e512 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1297.768688] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1297.768829] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1298.003500] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b807e27b-062b-4c7c-a490-f718e847291f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1298.011319] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9da6d5b1-b12e-4e37-894e-1680e9143c41 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1298.041401] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6de25b4-66e7-4080-902a-ed2a2bfdb1a1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1298.048331] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f6c0a405-e11b-456c-84c0-c55405bd428a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1298.060851] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1298.069160] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1298.083289] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1298.083464] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.522s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1301.078064] env[68571]: DEBUG oslo_service.periodic_task [None 
req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1301.489521] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1301.489743] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1302.489883] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1303.490578] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1303.490907] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1303.490907] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1303.513673] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1303.513886] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1303.514041] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1303.514174] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1303.514299] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Skipping network cache update for instance because it is Building. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1303.514422] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1303.514541] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1303.514839] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1303.515017] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1303.515155] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1303.515285] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1304.489337] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1305.489588] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1305.489893] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1306.436532] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b115017e-4635-4cad-a5ed-12b223dcd745 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquiring lock "a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1306.490246] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1331.474029] env[68571]: WARNING oslo_vmware.rw_handles [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1331.474029] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1331.474029] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1331.474029] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1331.474029] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1331.474029] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 1331.474029] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1331.474029] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1331.474029] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1331.474029] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1331.474029] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1331.474029] env[68571]: ERROR oslo_vmware.rw_handles [ 1331.474029] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/8a23b1f6-8888-46ce-8eb2-50f925528245/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1331.476069] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1331.476322] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Copying Virtual Disk [datastore1]
vmware_temp/8a23b1f6-8888-46ce-8eb2-50f925528245/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/8a23b1f6-8888-46ce-8eb2-50f925528245/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1331.476618] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ab6fc9b2-707e-4328-bd20-c4d936ec5fbc {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1331.484130] env[68571]: DEBUG oslo_vmware.api [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Waiting for the task: (returnval){ [ 1331.484130] env[68571]: value = "task-3467692" [ 1331.484130] env[68571]: _type = "Task" [ 1331.484130] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1331.492157] env[68571]: DEBUG oslo_vmware.api [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Task: {'id': task-3467692, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1331.994848] env[68571]: DEBUG oslo_vmware.exceptions [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Fault InvalidArgument not matched. {{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1331.994848] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1331.995390] env[68571]: ERROR nova.compute.manager [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1331.995390] env[68571]: Faults: ['InvalidArgument'] [ 1331.995390] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Traceback (most recent call last): [ 1331.995390] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1331.995390] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] yield resources [ 1331.995390] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1331.995390] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] self.driver.spawn(context, instance, image_meta, [ 1331.995390] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] 
File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1331.995390] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1331.995390] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1331.995390] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] self._fetch_image_if_missing(context, vi) [ 1331.995390] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1331.996040] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] image_cache(vi, tmp_image_ds_loc) [ 1331.996040] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1331.996040] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] vm_util.copy_virtual_disk( [ 1331.996040] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1331.996040] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] session._wait_for_task(vmdk_copy_task) [ 1331.996040] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1331.996040] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] return self.wait_for_task(task_ref) [ 1331.996040] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1331.996040] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] return evt.wait() [ 1331.996040] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1331.996040] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] result = hub.switch() [ 1331.996040] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1331.996040] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] return self.greenlet.switch() [ 1331.996727] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1331.996727] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] self.f(*self.args, **self.kw) [ 1331.996727] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1331.996727] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] raise 
exceptions.translate_fault(task_info.error) [ 1331.996727] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1331.996727] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Faults: ['InvalidArgument'] [ 1331.996727] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] [ 1331.996727] env[68571]: INFO nova.compute.manager [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Terminating instance [ 1331.997349] env[68571]: DEBUG oslo_concurrency.lockutils [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1331.997555] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1331.997842] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-59ab6e8b-d8a3-46aa-977c-e0d1c195359a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1332.000514] env[68571]: DEBUG nova.compute.manager [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Start destroying the instance on the hypervisor. 
{{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1332.000748] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1332.001505] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cab3457-442a-422b-bb9c-360ba5b60da9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1332.008365] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1332.008598] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-35925d97-3302-44bc-84eb-4b0ea8e903cf {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1332.010858] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1332.011044] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1332.012035] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-c9d2e82e-738b-48cd-99f8-d1cdbaf20434 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1332.016921] env[68571]: DEBUG oslo_vmware.api [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Waiting for the task: (returnval){ [ 1332.016921] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52d0a05e-7068-deef-fec2-91cba3fedae8" [ 1332.016921] env[68571]: _type = "Task" [ 1332.016921] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1332.028671] env[68571]: DEBUG oslo_vmware.api [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52d0a05e-7068-deef-fec2-91cba3fedae8, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1332.528069] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1332.528069] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Creating directory with path [datastore1] vmware_temp/7673eb11-5b79-4177-bddf-4c3b6801a7f2/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1332.528069] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0a811f12-3ec6-465a-8d05-1224e7767cdf {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1332.540715] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1332.540936] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1332.541156] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Deleting the datastore file [datastore1] 244ba708-279e-440e-bc18-8c6ee7b83250 {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1332.541458] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-994c75c9-2373-4394-915a-a8acd76a400d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1332.548032] env[68571]: DEBUG oslo_vmware.api [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Waiting for the task: (returnval){ [ 1332.548032] env[68571]: value = "task-3467694" [ 1332.548032] env[68571]: _type = "Task" [ 1332.548032] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1332.552258] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Created directory with path [datastore1] vmware_temp/7673eb11-5b79-4177-bddf-4c3b6801a7f2/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1332.552447] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Fetch image to [datastore1] vmware_temp/7673eb11-5b79-4177-bddf-4c3b6801a7f2/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1332.552618] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/7673eb11-5b79-4177-bddf-4c3b6801a7f2/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1332.553707] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-27a98f3e-da99-4773-a4be-f444e2c72f72 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1332.558797] env[68571]: DEBUG oslo_vmware.api [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Task: {'id': task-3467694, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1332.562810] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32eab010-a6b4-40f1-99de-bc6a88d5a675 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1332.571779] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37d04501-0f1e-4a48-b7bd-7ad5e2135e32 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1332.603257] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a8fa524-a033-42b4-a958-71294324e45c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1332.608996] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d291d8af-7221-442c-a4e9-7a1544c789ea {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1332.631787] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1332.681392] env[68571]: DEBUG oslo_vmware.rw_handles [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7673eb11-5b79-4177-bddf-4c3b6801a7f2/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1332.740729] env[68571]: DEBUG oslo_vmware.rw_handles [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1332.740929] env[68571]: DEBUG oslo_vmware.rw_handles [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/7673eb11-5b79-4177-bddf-4c3b6801a7f2/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1333.058364] env[68571]: DEBUG oslo_vmware.api [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Task: {'id': task-3467694, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.076824} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1333.058609] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1333.058791] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1333.058986] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1333.059181] env[68571]: INFO nova.compute.manager [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Took 1.06 seconds to destroy the instance on the hypervisor. [ 1333.061227] env[68571]: DEBUG nova.compute.claims [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1333.061398] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1333.061607] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1333.378225] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00c20e8e-c9de-4a25-9884-a90b10abe4b3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1333.386122] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22076728-6ecc-487c-abda-301ea05b2d76 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1333.414665] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-660abdb4-b341-418c-830d-640a711f9393 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1333.421714] env[68571]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57003779-42cf-44fd-9530-c9e34d62b7d5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1333.435540] env[68571]: DEBUG nova.compute.provider_tree [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1333.445771] env[68571]: DEBUG nova.scheduler.client.report [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1333.460128] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.398s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1333.461539] env[68571]: ERROR nova.compute.manager [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1333.461539] env[68571]: Faults: ['InvalidArgument'] [ 1333.461539] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Traceback (most recent call last): [ 1333.461539] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1333.461539] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] self.driver.spawn(context, instance, image_meta, [ 1333.461539] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1333.461539] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1333.461539] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1333.461539] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] self._fetch_image_if_missing(context, vi) [ 1333.461539] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 
1333.461539] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] image_cache(vi, tmp_image_ds_loc) [ 1333.461539] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1333.461877] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] vm_util.copy_virtual_disk( [ 1333.461877] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1333.461877] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] session._wait_for_task(vmdk_copy_task) [ 1333.461877] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1333.461877] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] return self.wait_for_task(task_ref) [ 1333.461877] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1333.461877] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] return evt.wait() [ 1333.461877] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1333.461877] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] result = hub.switch() [ 1333.461877] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1333.461877] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] return self.greenlet.switch() [ 1333.461877] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1333.461877] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] self.f(*self.args, **self.kw) [ 1333.462452] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1333.462452] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] raise exceptions.translate_fault(task_info.error) [ 1333.462452] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1333.462452] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Faults: ['InvalidArgument'] [ 1333.462452] env[68571]: ERROR nova.compute.manager [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] [ 1333.462452] env[68571]: DEBUG nova.compute.utils [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] VimFaultException {{(pid=68571) notify_about_instance_usage 
/opt/stack/nova/nova/compute/utils.py:430}} [ 1333.463683] env[68571]: DEBUG nova.compute.manager [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Build of instance 244ba708-279e-440e-bc18-8c6ee7b83250 was re-scheduled: A specified parameter was not correct: fileType [ 1333.463683] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1333.464083] env[68571]: DEBUG nova.compute.manager [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1333.464262] env[68571]: DEBUG nova.compute.manager [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1333.464431] env[68571]: DEBUG nova.compute.manager [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1333.464589] env[68571]: DEBUG nova.network.neutron [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1333.792117] env[68571]: DEBUG nova.network.neutron [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1333.803027] env[68571]: INFO nova.compute.manager [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Took 0.34 seconds to deallocate network for instance.
[ 1333.891634] env[68571]: INFO nova.scheduler.client.report [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Deleted allocations for instance 244ba708-279e-440e-bc18-8c6ee7b83250 [ 1333.915096] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f94f3506-1c7a-4263-b814-af9a82b7701c tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Lock "244ba708-279e-440e-bc18-8c6ee7b83250" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 671.812s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1333.916298] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1630af07-5515-4c35-aa8d-4fac497b635a tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Lock "244ba708-279e-440e-bc18-8c6ee7b83250" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 474.773s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1333.916572] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1630af07-5515-4c35-aa8d-4fac497b635a tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Acquiring lock "244ba708-279e-440e-bc18-8c6ee7b83250-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1333.916787] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1630af07-5515-4c35-aa8d-4fac497b635a tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Lock "244ba708-279e-440e-bc18-8c6ee7b83250-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1333.916952] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1630af07-5515-4c35-aa8d-4fac497b635a tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Lock "244ba708-279e-440e-bc18-8c6ee7b83250-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1333.918883] env[68571]: INFO nova.compute.manager [None req-1630af07-5515-4c35-aa8d-4fac497b635a tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Terminating instance [ 1333.920471] env[68571]: DEBUG nova.compute.manager [None req-1630af07-5515-4c35-aa8d-4fac497b635a tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Start destroying the instance on the hypervisor.
{{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1333.920664] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1630af07-5515-4c35-aa8d-4fac497b635a tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1333.921164] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-bc0e5a13-bb46-4015-bb00-ce045e7f3872 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1333.926332] env[68571]: DEBUG nova.compute.manager [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1333.935183] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea04d2ba-6cda-4c7f-998f-bcf95113757c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1333.961670] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-1630af07-5515-4c35-aa8d-4fac497b635a tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 244ba708-279e-440e-bc18-8c6ee7b83250 could not be found. [ 1333.961880] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1630af07-5515-4c35-aa8d-4fac497b635a tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1333.962068] env[68571]: INFO nova.compute.manager [None req-1630af07-5515-4c35-aa8d-4fac497b635a tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1333.962314] env[68571]: DEBUG oslo.service.loopingcall [None req-1630af07-5515-4c35-aa8d-4fac497b635a tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1333.964457] env[68571]: DEBUG nova.compute.manager [-] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1333.964558] env[68571]: DEBUG nova.network.neutron [-] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1333.977659] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1333.977889] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1333.979292] env[68571]: INFO nova.compute.claims [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1333.989726] env[68571]: DEBUG nova.network.neutron [-] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1333.997134] env[68571]: INFO nova.compute.manager [-] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] Took 0.03 seconds to deallocate network for instance. [ 1334.086928] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1630af07-5515-4c35-aa8d-4fac497b635a tempest-AttachInterfacesV270Test-831853663 tempest-AttachInterfacesV270Test-831853663-project-member] Lock "244ba708-279e-440e-bc18-8c6ee7b83250" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.171s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1334.087760] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "244ba708-279e-440e-bc18-8c6ee7b83250" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 219.583s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1334.087946] env[68571]: INFO nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 244ba708-279e-440e-bc18-8c6ee7b83250] During sync_power_state the instance has a pending task (deleting). Skip.
[ 1334.088135] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "244ba708-279e-440e-bc18-8c6ee7b83250" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1334.271581] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29360b90-69d7-4314-87f5-3a3e8f3ad261 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1334.279266] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d10ce95-3695-414e-bb3b-b8a87eef2d4e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1335.004445] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-07566e4d-ed7b-4ce8-be89-fc6d43017e70 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1335.012307] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-36fbfd1a-4105-4ca6-b4cc-776b617e9f92 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1335.025198] env[68571]: DEBUG nova.compute.provider_tree [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1335.033849] env[68571]: DEBUG nova.scheduler.client.report [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1335.047404] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.069s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1335.047853] env[68571]: DEBUG nova.compute.manager [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Start building networks asynchronously for instance.
{{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1335.081768] env[68571]: DEBUG nova.compute.utils [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1335.082984] env[68571]: DEBUG nova.compute.manager [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1335.083258] env[68571]: DEBUG nova.network.neutron [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1335.094797] env[68571]: DEBUG nova.compute.manager [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1335.151431] env[68571]: DEBUG nova.policy [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'be8a724337214e1b9634de8759ad057a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd8c26c35b6924883bd76e2080a10614f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 1335.177172] env[68571]: DEBUG nova.compute.manager [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1335.202209] env[68571]: DEBUG nova.virt.hardware [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1335.202412] env[68571]: DEBUG nova.virt.hardware [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1335.202570] env[68571]: DEBUG nova.virt.hardware [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1335.202749] env[68571]: DEBUG nova.virt.hardware [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1335.202893] env[68571]: DEBUG nova.virt.hardware [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1335.203047] env[68571]: DEBUG nova.virt.hardware [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1335.203381] env[68571]: DEBUG nova.virt.hardware [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1335.203573] env[68571]: DEBUG nova.virt.hardware [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Build topologies for 1 vcpu(s) 1:1:1 
{{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1335.203745] env[68571]: DEBUG nova.virt.hardware [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1335.203912] env[68571]: DEBUG nova.virt.hardware [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1335.204094] env[68571]: DEBUG nova.virt.hardware [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1335.204970] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-77d45648-aa9c-4ca7-af56-c227fe8edbaf {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1335.213516] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a6d09f2d-414b-4aac-858f-6009ada3227d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1335.455331] env[68571]: DEBUG nova.network.neutron [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Successfully created port: bfff9523-3a95-43cd-bd59-01ae43c8f0a4 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1336.011735] env[68571]: DEBUG nova.network.neutron [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Successfully updated port: bfff9523-3a95-43cd-bd59-01ae43c8f0a4 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1336.024718] env[68571]: DEBUG nova.compute.manager [req-df57081d-e5eb-422d-b181-7f313e914a52 req-4139bcb2-2a9b-424e-8fc6-d387abe9aa56 service nova] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Received event network-vif-plugged-bfff9523-3a95-43cd-bd59-01ae43c8f0a4 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1336.025968] env[68571]: DEBUG oslo_concurrency.lockutils [req-df57081d-e5eb-422d-b181-7f313e914a52 req-4139bcb2-2a9b-424e-8fc6-d387abe9aa56 service nova] Acquiring lock "56c7e368-4032-4028-83f0-58b0cd3b3cbd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1336.025968] env[68571]: DEBUG oslo_concurrency.lockutils [req-df57081d-e5eb-422d-b181-7f313e914a52 req-4139bcb2-2a9b-424e-8fc6-d387abe9aa56 service nova] Lock "56c7e368-4032-4028-83f0-58b0cd3b3cbd-events" acquired by
"nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1336.025968] env[68571]: DEBUG oslo_concurrency.lockutils [req-df57081d-e5eb-422d-b181-7f313e914a52 req-4139bcb2-2a9b-424e-8fc6-d387abe9aa56 service nova] Lock "56c7e368-4032-4028-83f0-58b0cd3b3cbd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1336.025968] env[68571]: DEBUG nova.compute.manager [req-df57081d-e5eb-422d-b181-7f313e914a52 req-4139bcb2-2a9b-424e-8fc6-d387abe9aa56 service nova] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] No waiting events found dispatching network-vif-plugged-bfff9523-3a95-43cd-bd59-01ae43c8f0a4 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1336.026174] env[68571]: WARNING nova.compute.manager [req-df57081d-e5eb-422d-b181-7f313e914a52 req-4139bcb2-2a9b-424e-8fc6-d387abe9aa56 service nova] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Received unexpected event network-vif-plugged-bfff9523-3a95-43cd-bd59-01ae43c8f0a4 for instance with vm_state building and task_state spawning. [ 1336.028891] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Acquiring lock "refresh_cache-56c7e368-4032-4028-83f0-58b0cd3b3cbd" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1336.028953] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Acquired lock "refresh_cache-56c7e368-4032-4028-83f0-58b0cd3b3cbd" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1336.029107] env[68571]: DEBUG nova.network.neutron [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1336.074777] env[68571]: DEBUG nova.network.neutron [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Instance cache missing network info. 
{{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1336.239042] env[68571]: DEBUG nova.network.neutron [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Updating instance_info_cache with network_info: [{"id": "bfff9523-3a95-43cd-bd59-01ae43c8f0a4", "address": "fa:16:3e:ba:88:9f", "network": {"id": "f6b0a8d8-c3ca-4b39-9537-aeeca49cd150", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1819267552-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d8c26c35b6924883bd76e2080a10614f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "305ccd93-08cb-4658-845c-d9b64952daf7", "external-id": "nsx-vlan-transportzone-490", "segmentation_id": 490, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbfff9523-3a", "ovs_interfaceid": "bfff9523-3a95-43cd-bd59-01ae43c8f0a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1336.252211] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Releasing lock "refresh_cache-56c7e368-4032-4028-83f0-58b0cd3b3cbd" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1336.252513] env[68571]: DEBUG nova.compute.manager [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Instance network_info: |[{"id": "bfff9523-3a95-43cd-bd59-01ae43c8f0a4", "address": "fa:16:3e:ba:88:9f", "network": {"id": "f6b0a8d8-c3ca-4b39-9537-aeeca49cd150", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1819267552-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d8c26c35b6924883bd76e2080a10614f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "305ccd93-08cb-4658-845c-d9b64952daf7", "external-id": "nsx-vlan-transportzone-490", "segmentation_id": 490, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbfff9523-3a", "ovs_interfaceid": "bfff9523-3a95-43cd-bd59-01ae43c8f0a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, 
"meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1336.252949] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ba:88:9f', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '305ccd93-08cb-4658-845c-d9b64952daf7', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'bfff9523-3a95-43cd-bd59-01ae43c8f0a4', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1336.260302] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Creating folder: Project (d8c26c35b6924883bd76e2080a10614f). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1336.260855] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-b31213d3-0f1a-44a9-85c6-cf4d20276856 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1336.271761] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Created folder: Project (d8c26c35b6924883bd76e2080a10614f) in parent group-v692787. [ 1336.271892] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Creating folder: Instances. Parent ref: group-v692863. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1336.272174] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f03a4b3c-7469-4d62-9c72-a8568197b16f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1336.281784] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Created folder: Instances in parent group-v692863. [ 1336.282014] env[68571]: DEBUG oslo.service.loopingcall [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1336.282199] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1336.282390] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-19b8ac81-9faf-4272-a92e-089f6189f2c0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1336.300953] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1336.300953] env[68571]: value = "task-3467697" [ 1336.300953] env[68571]: _type = "Task" [ 1336.300953] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1336.308218] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467697, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1336.811087] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467697, 'name': CreateVM_Task, 'duration_secs': 0.294332} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1336.811343] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1336.812145] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1336.812369] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1336.812756] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1336.813127] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9916b14d-68b3-47ca-95aa-1187a0305ed9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1336.817832] env[68571]: DEBUG oslo_vmware.api [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Waiting for the task: (returnval){ [ 1336.817832] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]521e1994-59d6-ee3a-507d-26b047494b03" [ 1336.817832] env[68571]: _type = "Task" 
[ 1336.817832] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1336.827389] env[68571]: DEBUG oslo_vmware.api [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]521e1994-59d6-ee3a-507d-26b047494b03, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1337.328429] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1337.328429] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1337.328429] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1338.053711] env[68571]: DEBUG nova.compute.manager [req-38d3bd32-70f4-4bb1-969d-1ac8946e6a83 req-f12eb266-9440-461f-b567-c1fdefee814f service nova] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Received event network-changed-bfff9523-3a95-43cd-bd59-01ae43c8f0a4 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1338.053912] env[68571]: DEBUG nova.compute.manager [req-38d3bd32-70f4-4bb1-969d-1ac8946e6a83 req-f12eb266-9440-461f-b567-c1fdefee814f service nova] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Refreshing instance network info cache due to event network-changed-bfff9523-3a95-43cd-bd59-01ae43c8f0a4. 
{{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1338.054157] env[68571]: DEBUG oslo_concurrency.lockutils [req-38d3bd32-70f4-4bb1-969d-1ac8946e6a83 req-f12eb266-9440-461f-b567-c1fdefee814f service nova] Acquiring lock "refresh_cache-56c7e368-4032-4028-83f0-58b0cd3b3cbd" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1338.054304] env[68571]: DEBUG oslo_concurrency.lockutils [req-38d3bd32-70f4-4bb1-969d-1ac8946e6a83 req-f12eb266-9440-461f-b567-c1fdefee814f service nova] Acquired lock "refresh_cache-56c7e368-4032-4028-83f0-58b0cd3b3cbd" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1338.054577] env[68571]: DEBUG nova.network.neutron [req-38d3bd32-70f4-4bb1-969d-1ac8946e6a83 req-f12eb266-9440-461f-b567-c1fdefee814f service nova] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Refreshing network info cache for port bfff9523-3a95-43cd-bd59-01ae43c8f0a4 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1338.329910] env[68571]: DEBUG nova.network.neutron [req-38d3bd32-70f4-4bb1-969d-1ac8946e6a83 req-f12eb266-9440-461f-b567-c1fdefee814f service nova] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Updated VIF entry in instance network info cache for port bfff9523-3a95-43cd-bd59-01ae43c8f0a4. {{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1338.330379] env[68571]: DEBUG nova.network.neutron [req-38d3bd32-70f4-4bb1-969d-1ac8946e6a83 req-f12eb266-9440-461f-b567-c1fdefee814f service nova] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Updating instance_info_cache with network_info: [{"id": "bfff9523-3a95-43cd-bd59-01ae43c8f0a4", "address": "fa:16:3e:ba:88:9f", "network": {"id": "f6b0a8d8-c3ca-4b39-9537-aeeca49cd150", "bridge": "br-int", "label": "tempest-FloatingIPsAssociationTestJSON-1819267552-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "d8c26c35b6924883bd76e2080a10614f", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "305ccd93-08cb-4658-845c-d9b64952daf7", "external-id": "nsx-vlan-transportzone-490", "segmentation_id": 490, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapbfff9523-3a", "ovs_interfaceid": "bfff9523-3a95-43cd-bd59-01ae43c8f0a4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1338.341652] env[68571]: DEBUG oslo_concurrency.lockutils [req-38d3bd32-70f4-4bb1-969d-1ac8946e6a83 req-f12eb266-9440-461f-b567-c1fdefee814f service nova] Releasing lock "refresh_cache-56c7e368-4032-4028-83f0-58b0cd3b3cbd" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1347.869560] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f6bf90a6-4734-4105-94a5-913444bff1d6 tempest-FloatingIPsAssociationTestJSON-1247097416 
tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Acquiring lock "56c7e368-4032-4028-83f0-58b0cd3b3cbd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1356.948846] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "9e8c8d14-144f-42e3-8556-796651b7b04f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1356.949184] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "9e8c8d14-144f-42e3-8556-796651b7b04f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1359.490192] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1359.503381] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1359.503579] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1359.503748] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1359.503905] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1359.504997] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bfb2f945-ef3a-4716-bdca-1eb20fd056be {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.513870] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a07430fd-ab6a-451e-a26c-5915f31eced2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.527485] env[68571]: DEBUG oslo_vmware.service [-] Invoking
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad6bfcbc-ca4c-4f63-9f06-323c14d2f832 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.533760] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0dd365f3-4c63-423d-982f-a617e8cb0687 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1359.562189] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180931MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1359.562355] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1359.562556] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1359.638704] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1359.638866] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance b60eb700-434f-4bea-a84f-9071402001c3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1359.638992] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 5e571ae2-9d45-402d-bce5-6e3721cc5374 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1359.639131] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1359.639274] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance b90ac11a-50c6-4d12-a545-ccd92243e6ca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1359.639444] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance afe033a3-4e04-4249-beed-169a3e40a721 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1359.639490] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f5328efa-b3e0-48b2-8f13-9715e46cb017 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1359.639603] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance e025f82d-a6a8-4dd4-b891-872f4b2fa176 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1359.639717] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1359.639827] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 56c7e368-4032-4028-83f0-58b0cd3b3cbd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1359.652369] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 67209cb0-7bb2-4aed-969a-e0d208fbf71b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1359.663245] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 3cea970e-78f8-4b67-9350-65d3507f6b18 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1359.672958] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance d62a50a6-fef2-42a8-a066-e36211c57f73 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1359.683820] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f0b9847b-9438-4be7-a081-db33dd3ff998 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1359.695427] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance b6a0771c-53cb-4503-bbc0-db992326b245 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1359.706024] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 6532563b-5e91-409f-be05-084196087a4d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1359.715817] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 10b3cea3-b9d1-45b7-9ac8-b922952371ba has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1359.725207] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 47df3a07-1271-482c-bd3a-92fb9cef17bd has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1359.735057] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 73ba7761-3724-46ed-95c5-e93a6627a2d3 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1359.744826] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 36548949-5053-4f4c-a0ca-ac5487a6cf14 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1359.753909] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance d890a035-a14e-4be0-97c8-87edd9bb88e4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1359.762822] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4f0cfa21-d717-494c-8201-2c85dd11e512 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1359.772092] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 9e8c8d14-144f-42e3-8556-796651b7b04f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1359.772315] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1359.772528] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1360.028261] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b75a6a53-be37-447e-8a0f-e23645b115fc {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1360.036209] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba3f58e3-68e1-42c8-9095-6ca4c56efbca {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1360.064558] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f09dc804-0681-4ce7-94e7-cb330efc4efe {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1360.071654] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c870a7f6-d0b9-4b96-a0e6-424d2c3c9b96 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1360.084864] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 
00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1360.093782] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1360.108929] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1360.109119] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.547s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1361.103407] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1361.489033] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1361.489312] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1362.489508] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1365.489960] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1365.490269] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1365.490269] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1365.514056] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 
3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1365.514240] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1365.514376] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1365.514503] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1365.514627] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1365.514748] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1365.514865] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1365.514981] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1365.515109] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1365.515224] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1365.515546] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1365.516108] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1365.516281] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1366.489431] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1366.489649] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1368.488431] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1368.781465] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Acquiring lock "1f8dd053-ebd8-4ad9-a607-ab364a3320ca" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1368.781703] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Lock "1f8dd053-ebd8-4ad9-a607-ab364a3320ca" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1375.751555] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1f7f3fcc-d403-4056-9967-103f75a9aec9 tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Acquiring lock "780d6657-20dc-4d8c-acec-0e002f79372b" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1375.751857] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1f7f3fcc-d403-4056-9967-103f75a9aec9 tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Lock "780d6657-20dc-4d8c-acec-0e002f79372b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1381.638977] env[68571]: WARNING oslo_vmware.rw_handles [None req-93786b12-22a3-43e7-b125-b415367edcd9 
tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1381.638977] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1381.638977] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1381.638977] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1381.638977] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1381.638977] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 1381.638977] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1381.638977] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1381.638977] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1381.638977] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1381.638977] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1381.638977] env[68571]: ERROR oslo_vmware.rw_handles [ 1381.639686] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/7673eb11-5b79-4177-bddf-4c3b6801a7f2/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1381.641609] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1381.641869] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Copying Virtual Disk [datastore1] vmware_temp/7673eb11-5b79-4177-bddf-4c3b6801a7f2/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/7673eb11-5b79-4177-bddf-4c3b6801a7f2/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1381.642161] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-bd62f448-546f-4b56-ba35-63e305116e7a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1381.650561] env[68571]: DEBUG oslo_vmware.api [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Waiting for the task: (returnval){ [ 1381.650561] env[68571]: value = "task-3467698" [ 1381.650561] env[68571]: _type = "Task" [ 1381.650561] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1381.658750] env[68571]: DEBUG oslo_vmware.api [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Task: {'id': task-3467698, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1382.161776] env[68571]: DEBUG oslo_vmware.exceptions [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Fault InvalidArgument not matched. {{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1382.162062] env[68571]: DEBUG oslo_concurrency.lockutils [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1382.162609] env[68571]: ERROR nova.compute.manager [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1382.162609] env[68571]: Faults: ['InvalidArgument'] [ 1382.162609] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Traceback (most recent call last): [ 1382.162609] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1382.162609] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] yield resources [ 1382.162609] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1382.162609] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] self.driver.spawn(context, instance, image_meta, [ 1382.162609] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1382.162609] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1382.162609] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1382.162609] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] self._fetch_image_if_missing(context, vi) [ 1382.162609] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1382.162938] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] image_cache(vi, tmp_image_ds_loc) [ 1382.162938] env[68571]: ERROR nova.compute.manager [instance: 
3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1382.162938] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] vm_util.copy_virtual_disk( [ 1382.162938] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1382.162938] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] session._wait_for_task(vmdk_copy_task) [ 1382.162938] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1382.162938] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] return self.wait_for_task(task_ref) [ 1382.162938] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1382.162938] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] return evt.wait() [ 1382.162938] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1382.162938] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] result = hub.switch() [ 1382.162938] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1382.162938] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] return self.greenlet.switch() [ 1382.163352] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1382.163352] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] self.f(*self.args, **self.kw) [ 1382.163352] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1382.163352] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] raise exceptions.translate_fault(task_info.error) [ 1382.163352] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1382.163352] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Faults: ['InvalidArgument'] [ 1382.163352] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] [ 1382.163352] env[68571]: INFO nova.compute.manager [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Terminating instance [ 1382.164640] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1382.164844] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1382.165507] env[68571]: DEBUG nova.compute.manager [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1382.165704] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1382.165933] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9569ef45-8c0f-4fb1-a3c9-7855a4331453 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1382.168266] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29ce7c21-fe5e-485b-8803-98f47ee4ab66 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1382.174741] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1382.174952] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-32c47bdf-714b-4476-9703-414c2f1f3d0b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1382.177136] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1382.177309] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1382.178319] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-baa5c0c6-0df2-4697-8e2b-7000071039bb {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1382.182783] env[68571]: DEBUG oslo_vmware.api [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Waiting for the task: (returnval){ [ 1382.182783] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]520f5662-cade-18df-cb2c-aede9f0eb10a" [ 1382.182783] env[68571]: _type = "Task" [ 1382.182783] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1382.191272] env[68571]: DEBUG oslo_vmware.api [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]520f5662-cade-18df-cb2c-aede9f0eb10a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1382.245118] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1382.245350] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1382.245608] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Deleting the datastore file [datastore1] 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6 {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1382.245874] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e099a2de-50bd-4b05-b94e-53fa22e4c405 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1382.252207] env[68571]: DEBUG oslo_vmware.api [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Waiting for the task: (returnval){ [ 1382.252207] env[68571]: value = "task-3467700" [ 1382.252207] env[68571]: _type = "Task" [ 1382.252207] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1382.259711] env[68571]: DEBUG oslo_vmware.api [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Task: {'id': task-3467700, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1382.692903] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1382.693232] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Creating directory with path [datastore1] vmware_temp/ee7623aa-7d13-4d97-ba92-aca7372e01f2/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1382.693408] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-96d87a80-e83e-439a-b4cc-c8c55d7a1119 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1382.705021] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Created directory with path [datastore1] vmware_temp/ee7623aa-7d13-4d97-ba92-aca7372e01f2/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1382.705220] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Fetch image to [datastore1] vmware_temp/ee7623aa-7d13-4d97-ba92-aca7372e01f2/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1382.705390] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/ee7623aa-7d13-4d97-ba92-aca7372e01f2/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1382.706178] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8fa2a03-314b-435f-a2ea-75e5701eafd9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1382.712706] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c92a15d4-ac76-4cd4-8b6e-f6a41c2f3fca {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1382.721568] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f33b87b-86a5-48c2-8fc3-a20bd9bdce06 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1382.753022] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b009e623-9369-4766-a9de-60b09df77e98 {{(pid=68571) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1382.764460] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-98ad3ac6-b23e-4274-88b9-88de66d2aafc {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1382.766170] env[68571]: DEBUG oslo_vmware.api [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Task: {'id': task-3467700, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065567} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1382.766432] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1382.766628] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1382.766801] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1382.766976] env[68571]: INFO nova.compute.manager [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Took 0.60 seconds to destroy the instance on the hypervisor. 
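The spawn failure above is worth annotating: the image data had already been downloaded to vmware_temp/7673eb11-.../tmp-sparse.vmdk, but the follow-up CopyVirtualDisk_Task (task-3467698) was rejected by vCenter with InvalidArgument ("A specified parameter was not correct: fileType"), and the log shows contention on the same cached vmdk, since req-ac11f071 acquires the devstack-image-cache_base lock immediately afterwards and restarts the download. Nova's reaction is exactly what the entries record: unregister the VM, delete its datastore directory, then abort the resource claim under the "compute_resources" lock. Below is a minimal sketch of that poll, translate-fault, clean-up shape; all names (VimFaultError, wait_for_task, spawn_with_cleanup) are hypothetical stand-ins for illustration, not Nova's or oslo.vmware's actual objects.

    import time

    class VimFaultError(Exception):
        # Carries the vCenter fault names, like oslo.vmware's VimFaultException
        def __init__(self, faults, message):
            super().__init__(message)
            self.faults = faults

    def wait_for_task(poll, interval=0.5):
        # Poll a vCenter-style task until it finishes; an error state is
        # translated into a fault-carrying exception, mirroring the
        # wait_for_task/_poll_task pair in the traceback above.
        while True:
            info = poll()
            if info['state'] == 'success':
                return info.get('result')
            if info['state'] == 'error':
                raise VimFaultError(info['faults'], info['message'])
            time.sleep(interval)

    def spawn_with_cleanup(copy_disk_poll, destroy_vm, abort_claim):
        # On a copy fault, mirror the logged sequence: destroy the half-built
        # VM (UnregisterVM + DeleteDatastoreFile_Task), then abort the claim
        # taken under the "compute_resources" lock.
        try:
            return wait_for_task(copy_disk_poll)
        except VimFaultError:
            destroy_vm()
            abort_claim()
            raise

    # Simulate the CopyVirtualDisk_Task failing with InvalidArgument:
    states = iter([
        {'state': 'running'},
        {'state': 'error', 'faults': ['InvalidArgument'],
         'message': 'A specified parameter was not correct: fileType'},
    ])
    try:
        spawn_with_cleanup(lambda: next(states),
                           destroy_vm=lambda: print('destroyed'),
                           abort_claim=lambda: print('claim aborted'))
    except VimFaultError as exc:
        print(exc, exc.faults)

The key design point visible in the log is that the cleanup runs before the exception propagates, so the resource tracker's next audit pass no longer counts the failed instance against the node.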
[ 1382.769055] env[68571]: DEBUG nova.compute.claims [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1382.769209] env[68571]: DEBUG oslo_concurrency.lockutils [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1382.769419] env[68571]: DEBUG oslo_concurrency.lockutils [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1382.789308] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1382.996887] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1382.998468] env[68571]: ERROR nova.compute.manager [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 6e7bf233-3ffe-4b3b-a510-62353d0292a6. 
[ 1382.998468] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] Traceback (most recent call last): [ 1382.998468] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1382.998468] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1382.998468] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1382.998468] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] result = getattr(controller, method)(*args, **kwargs) [ 1382.998468] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1382.998468] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return self._get(image_id) [ 1382.998468] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1382.998468] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1382.998468] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1382.998856] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] resp, body = self.http_client.get(url, headers=header) [ 1382.998856] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1382.998856] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return self.request(url, 'GET', **kwargs) [ 1382.998856] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1382.998856] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return self._handle_response(resp) [ 1382.998856] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1382.998856] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] raise exc.from_response(resp, resp.content) [ 1382.998856] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1382.998856] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] [ 1382.998856] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] During handling of the above exception, another exception occurred: [ 1382.998856] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] [ 1382.998856] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] Traceback (most recent call last): [ 1382.999192] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1382.999192] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] yield resources [ 1382.999192] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1382.999192] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] self.driver.spawn(context, instance, image_meta, [ 1382.999192] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1382.999192] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1382.999192] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1382.999192] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] self._fetch_image_if_missing(context, vi) [ 1382.999192] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1382.999192] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] image_fetch(context, vi, tmp_image_ds_loc) [ 1382.999192] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1382.999192] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] images.fetch_image( [ 1382.999192] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1382.999558] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] metadata = IMAGE_API.get(context, image_ref) [ 1382.999558] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1382.999558] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return session.show(context, image_id, [ 1382.999558] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1382.999558] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] _reraise_translated_image_exception(image_id) [ 1382.999558] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1382.999558] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] raise new_exc.with_traceback(exc_trace) [ 1382.999558] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1382.999558] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1382.999558] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1382.999558] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] result = getattr(controller, method)(*args, **kwargs) [ 1382.999558] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1382.999558] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return self._get(image_id) [ 1382.999925] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1382.999925] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1382.999925] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1382.999925] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] resp, body = self.http_client.get(url, headers=header) [ 1382.999925] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1382.999925] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return self.request(url, 'GET', **kwargs) [ 1382.999925] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1382.999925] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return self._handle_response(resp) [ 1382.999925] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1382.999925] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] raise exc.from_response(resp, resp.content) [ 1382.999925] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] nova.exception.ImageNotAuthorized: Not authorized for image 6e7bf233-3ffe-4b3b-a510-62353d0292a6. 
[ 1382.999925] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] [ 1383.000272] env[68571]: INFO nova.compute.manager [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Terminating instance [ 1383.000272] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1383.000733] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1383.002840] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ba2c6c6c-dc84-45c3-8a0a-33d80e4c2bab {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.005541] env[68571]: DEBUG nova.compute.manager [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1383.005738] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1383.006804] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ded51768-4620-4898-8e10-080bc21a4356 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.014837] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1383.015663] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-36118151-d69e-4868-9e67-5847e20a3f3b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.017123] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1383.017299] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 
tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1383.021581] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-99952145-8d1b-469b-b1dc-23991e34ddbd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.026790] env[68571]: DEBUG oslo_vmware.api [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Waiting for the task: (returnval){ [ 1383.026790] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52adf149-ad2d-8ee5-abfb-e3175068d25a" [ 1383.026790] env[68571]: _type = "Task" [ 1383.026790] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1383.034122] env[68571]: DEBUG oslo_vmware.api [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52adf149-ad2d-8ee5-abfb-e3175068d25a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1383.093031] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1383.093031] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1383.093031] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Deleting the datastore file [datastore1] b60eb700-434f-4bea-a84f-9071402001c3 {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1383.095666] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-7a92ba6c-26a1-4369-be8b-1842eda12649 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.102158] env[68571]: DEBUG oslo_vmware.api [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Waiting for the task: (returnval){ [ 1383.102158] env[68571]: value = "task-3467702" [ 1383.102158] env[68571]: _type = "Task" [ 1383.102158] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1383.111541] env[68571]: DEBUG oslo_vmware.api [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Task: {'id': task-3467702, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1383.118678] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e072d8ea-c06a-4c7d-8ba0-7a8291928363 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.125018] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d54a23df-d7e8-4a08-94ff-a9f984e6099f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.155286] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a02fb25-5039-4e9c-8658-1f1e4373ea28 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.163105] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0d5d332f-8b14-4c66-a589-fb4eaf4fc66d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.176298] env[68571]: DEBUG nova.compute.provider_tree [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1383.188110] env[68571]: DEBUG nova.scheduler.client.report [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1383.208721] env[68571]: DEBUG oslo_concurrency.lockutils [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.438s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1383.208721] env[68571]: ERROR nova.compute.manager [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1383.208721] env[68571]: Faults: 
['InvalidArgument'] [ 1383.208721] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Traceback (most recent call last): [ 1383.208721] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1383.208721] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] self.driver.spawn(context, instance, image_meta, [ 1383.208721] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1383.208721] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1383.208721] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1383.208721] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] self._fetch_image_if_missing(context, vi) [ 1383.209112] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1383.209112] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] image_cache(vi, tmp_image_ds_loc) [ 1383.209112] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1383.209112] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] vm_util.copy_virtual_disk( [ 1383.209112] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1383.209112] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] session._wait_for_task(vmdk_copy_task) [ 1383.209112] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1383.209112] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] return self.wait_for_task(task_ref) [ 1383.209112] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1383.209112] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] return evt.wait() [ 1383.209112] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1383.209112] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] result = hub.switch() [ 1383.209112] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1383.209516] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] return self.greenlet.switch() [ 1383.209516] env[68571]: ERROR nova.compute.manager [instance: 
3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1383.209516] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] self.f(*self.args, **self.kw) [ 1383.209516] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1383.209516] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] raise exceptions.translate_fault(task_info.error) [ 1383.209516] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1383.209516] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Faults: ['InvalidArgument'] [ 1383.209516] env[68571]: ERROR nova.compute.manager [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] [ 1383.209516] env[68571]: DEBUG nova.compute.utils [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1383.210415] env[68571]: DEBUG nova.compute.manager [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Build of instance 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6 was re-scheduled: A specified parameter was not correct: fileType [ 1383.210415] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1383.210778] env[68571]: DEBUG nova.compute.manager [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1383.210950] env[68571]: DEBUG nova.compute.manager [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1383.211137] env[68571]: DEBUG nova.compute.manager [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1383.211301] env[68571]: DEBUG nova.network.neutron [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1383.537844] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1383.538144] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Creating directory with path [datastore1] vmware_temp/ffc26c8a-9234-41f1-b2ce-bd61690cf758/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1383.538376] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-d133593f-1b9c-480f-bec5-67587403fd4c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.549698] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Created directory with path [datastore1] vmware_temp/ffc26c8a-9234-41f1-b2ce-bd61690cf758/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1383.549906] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Fetch image to [datastore1] vmware_temp/ffc26c8a-9234-41f1-b2ce-bd61690cf758/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1383.551538] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/ffc26c8a-9234-41f1-b2ce-bd61690cf758/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1383.551538] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-f0fbf6c4-209f-41af-9818-491ef9eb0d3f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.562988] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05e4f193-0c0c-4d72-910c-121500373bf5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.575966] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa580e52-c8e7-43eb-ba40-40ee1cde8a06 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.611772] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7fca0863-e628-4b8c-936b-397bc664450c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.624407] env[68571]: DEBUG oslo_vmware.api [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Task: {'id': task-3467702, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.065093} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1383.624552] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-e8b83683-0dd1-4092-b7ba-7cf6ef4e107f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1383.626275] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1383.626487] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1383.626704] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1383.626883] env[68571]: INFO nova.compute.manager [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Took 0.62 seconds to destroy the instance on the hypervisor. 
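The task records above (Waiting for the task, "progress is 0%.", and DeleteDatastoreFile_Task "completed successfully" with duration_secs) all follow one polling cycle: query the vCenter task state on an interval, log progress, return on success, and raise a translated fault on error, as the earlier VimFaultException traceback shows with raise exceptions.translate_fault(task_info.error). A schematic sketch of that loop; the function and field names are illustrative assumptions, not oslo.vmware's actual API:

    import time

    class VimFaultException(Exception):
        """Stand-in for oslo_vmware.exceptions.VimFaultException."""

    def wait_for_task(poll, interval=0.5):
        """Poll a task's state until it succeeds or errors out."""
        while True:
            info = poll()  # e.g. a PropertyCollector read of the Task object
            if info["state"] == "success":
                return info.get("result")
            if info["state"] == "error":
                # Mirrors "raise exceptions.translate_fault(task_info.error)"
                # in the VimFaultException traceback earlier in this log.
                raise VimFaultException(info["error"])
            print("Task progress is %s%%." % info.get("progress", 0))
            time.sleep(interval)

    # A fake task that completes on the third poll.
    states = iter([
        {"state": "running", "progress": 0},
        {"state": "running", "progress": 50},
        {"state": "success", "result": "task-3467702"},
    ])
    print(wait_for_task(lambda: next(states), interval=0))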
[ 1383.629358] env[68571]: DEBUG nova.compute.claims [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1383.629533] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1383.629752] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1383.650053] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1383.745403] env[68571]: DEBUG oslo_vmware.rw_handles [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ffc26c8a-9234-41f1-b2ce-bd61690cf758/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1383.812385] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2cd6c9cb-67c2-48bc-bc05-f367eae44daf tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Acquiring lock "47511138-2486-46a8-85d5-081388bb0b16" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1383.812620] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2cd6c9cb-67c2-48bc-bc05-f367eae44daf tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Lock "47511138-2486-46a8-85d5-081388bb0b16" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1383.813237] env[68571]: DEBUG oslo_vmware.rw_handles [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Completed reading data from the image iterator. 
{{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1383.813460] env[68571]: DEBUG oslo_vmware.rw_handles [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/ffc26c8a-9234-41f1-b2ce-bd61690cf758/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1383.843020] env[68571]: DEBUG nova.network.neutron [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1383.862430] env[68571]: INFO nova.compute.manager [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Took 0.65 seconds to deallocate network for instance. [ 1384.022163] env[68571]: INFO nova.scheduler.client.report [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Deleted allocations for instance 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6 [ 1384.052768] env[68571]: DEBUG oslo_concurrency.lockutils [None req-93786b12-22a3-43e7-b125-b415367edcd9 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 666.881s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.054032] env[68571]: DEBUG oslo_concurrency.lockutils [None req-0306e316-ef0f-4c6f-b54f-d2915d9193ef tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 470.694s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1384.054370] env[68571]: DEBUG oslo_concurrency.lockutils [None req-0306e316-ef0f-4c6f-b54f-d2915d9193ef tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquiring lock "3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1384.054671] env[68571]: DEBUG oslo_concurrency.lockutils [None req-0306e316-ef0f-4c6f-b54f-d2915d9193ef tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1384.054906] env[68571]: DEBUG 
oslo_concurrency.lockutils [None req-0306e316-ef0f-4c6f-b54f-d2915d9193ef tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.057607] env[68571]: INFO nova.compute.manager [None req-0306e316-ef0f-4c6f-b54f-d2915d9193ef tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Terminating instance [ 1384.059988] env[68571]: DEBUG nova.compute.manager [None req-0306e316-ef0f-4c6f-b54f-d2915d9193ef tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1384.060223] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-0306e316-ef0f-4c6f-b54f-d2915d9193ef tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1384.060512] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-b3b5c45b-edd7-4bb1-a847-8d03eff908d5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1384.063774] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-348c3e1e-67f0-41f7-8181-12ffbc13288b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1384.072743] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82320931-d801-4106-9f50-49432ed9f48d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1384.078544] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fab7fe56-abc7-4695-885c-84cf7fa71e24 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1384.125371] env[68571]: DEBUG nova.compute.manager [None req-20ddc929-573a-4f9f-9533-91af9da978f0 tempest-ListServerFiltersTestJSON-1460936247 tempest-ListServerFiltersTestJSON-1460936247-project-member] [instance: 67209cb0-7bb2-4aed-969a-e0d208fbf71b] Starting instance... 
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1384.129031] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-26b91711-ef89-46a3-a72d-dbcdc5c536b5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1384.132092] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-0306e316-ef0f-4c6f-b54f-d2915d9193ef tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6 could not be found. [ 1384.132292] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-0306e316-ef0f-4c6f-b54f-d2915d9193ef tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1384.132471] env[68571]: INFO nova.compute.manager [None req-0306e316-ef0f-4c6f-b54f-d2915d9193ef tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Took 0.07 seconds to destroy the instance on the hypervisor. [ 1384.132711] env[68571]: DEBUG oslo.service.loopingcall [None req-0306e316-ef0f-4c6f-b54f-d2915d9193ef tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1384.132932] env[68571]: DEBUG nova.compute.manager [-] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1384.133044] env[68571]: DEBUG nova.network.neutron [-] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1384.139638] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a8cab060-0208-43e8-a3dd-0969b8dc8cf7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1384.154759] env[68571]: DEBUG nova.compute.provider_tree [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1384.164555] env[68571]: DEBUG nova.scheduler.client.report [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 
'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1384.168236] env[68571]: DEBUG nova.network.neutron [-] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1384.169467] env[68571]: DEBUG nova.compute.manager [None req-20ddc929-573a-4f9f-9533-91af9da978f0 tempest-ListServerFiltersTestJSON-1460936247 tempest-ListServerFiltersTestJSON-1460936247-project-member] [instance: 67209cb0-7bb2-4aed-969a-e0d208fbf71b] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1384.175682] env[68571]: INFO nova.compute.manager [-] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] Took 0.04 seconds to deallocate network for instance. [ 1384.181545] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.552s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.182237] env[68571]: ERROR nova.compute.manager [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Failed to build and run instance: nova.exception.ImageNotAuthorized: Not authorized for image 6e7bf233-3ffe-4b3b-a510-62353d0292a6. [ 1384.182237] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] Traceback (most recent call last): [ 1384.182237] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1384.182237] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1384.182237] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1384.182237] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] result = getattr(controller, method)(*args, **kwargs) [ 1384.182237] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1384.182237] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return self._get(image_id) [ 1384.182237] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1384.182237] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1384.182237] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1384.182545] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] resp, body = 
self.http_client.get(url, headers=header) [ 1384.182545] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1384.182545] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return self.request(url, 'GET', **kwargs) [ 1384.182545] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1384.182545] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return self._handle_response(resp) [ 1384.182545] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1384.182545] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] raise exc.from_response(resp, resp.content) [ 1384.182545] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1384.182545] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] [ 1384.182545] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] During handling of the above exception, another exception occurred: [ 1384.182545] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] [ 1384.182545] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] Traceback (most recent call last): [ 1384.182813] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1384.182813] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] self.driver.spawn(context, instance, image_meta, [ 1384.182813] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1384.182813] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1384.182813] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1384.182813] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] self._fetch_image_if_missing(context, vi) [ 1384.182813] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1384.182813] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] image_fetch(context, vi, tmp_image_ds_loc) [ 1384.182813] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 
1384.182813] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] images.fetch_image( [ 1384.182813] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1384.182813] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] metadata = IMAGE_API.get(context, image_ref) [ 1384.182813] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1384.183131] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return session.show(context, image_id, [ 1384.183131] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1384.183131] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] _reraise_translated_image_exception(image_id) [ 1384.183131] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1384.183131] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] raise new_exc.with_traceback(exc_trace) [ 1384.183131] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1384.183131] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1384.183131] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1384.183131] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] result = getattr(controller, method)(*args, **kwargs) [ 1384.183131] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1384.183131] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return self._get(image_id) [ 1384.183131] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1384.183131] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1384.183479] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1384.183479] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] resp, body = self.http_client.get(url, headers=header) [ 1384.183479] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1384.183479] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return self.request(url, 'GET', **kwargs) [ 1384.183479] env[68571]: ERROR nova.compute.manager [instance: 
b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1384.183479] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return self._handle_response(resp) [ 1384.183479] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1384.183479] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] raise exc.from_response(resp, resp.content) [ 1384.183479] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] nova.exception.ImageNotAuthorized: Not authorized for image 6e7bf233-3ffe-4b3b-a510-62353d0292a6. [ 1384.183479] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] [ 1384.183479] env[68571]: DEBUG nova.compute.utils [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Not authorized for image 6e7bf233-3ffe-4b3b-a510-62353d0292a6. {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1384.184564] env[68571]: DEBUG nova.compute.manager [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Build of instance b60eb700-434f-4bea-a84f-9071402001c3 was re-scheduled: Not authorized for image 6e7bf233-3ffe-4b3b-a510-62353d0292a6. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1384.185018] env[68571]: DEBUG nova.compute.manager [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1384.185195] env[68571]: DEBUG nova.compute.manager [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
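The records above show the re-schedule path that the tracebacks in this log walk through: _build_and_run_instance wraps the driver failure in nova.exception.RescheduledException, and _do_build_and_run_instance then cleans up, attempting VIF unplugging (which this driver does not support) and deallocating the instance's networking, before the build is handed back to the scheduler. A condensed paraphrase of that visible control flow, not Nova's actual code:

    class RescheduledException(Exception):
        """Stand-in for nova.exception.RescheduledException."""

    def _build_and_run_instance(spawn):
        try:
            spawn()
        except Exception as exc:
            # Matches manager.py line 2722 in the tracebacks: the build
            # failure is wrapped so the build can be retried elsewhere.
            raise RescheduledException(
                "Build of instance was re-scheduled: %s" % exc) from exc

    def _do_build_and_run_instance(spawn):
        try:
            _build_and_run_instance(spawn)
        except RescheduledException as exc:
            print(exc)
            # Before rescheduling, Nova tries to unplug VIFs; this driver
            # provides no unplug_vifs, so it logs that and moves straight
            # on to deallocating the instance's networking.
            print("Deallocating network for instance")

    def failing_spawn():
        raise RuntimeError("Not authorized for image")

    _do_build_and_run_instance(failing_spawn)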
[ 1384.185349] env[68571]: DEBUG nova.compute.manager [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1384.185534] env[68571]: DEBUG nova.network.neutron [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1384.199855] env[68571]: DEBUG oslo_concurrency.lockutils [None req-20ddc929-573a-4f9f-9533-91af9da978f0 tempest-ListServerFiltersTestJSON-1460936247 tempest-ListServerFiltersTestJSON-1460936247-project-member] Lock "67209cb0-7bb2-4aed-969a-e0d208fbf71b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 217.344s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.228732] env[68571]: DEBUG nova.compute.manager [None req-8effc918-eea1-41da-bdf4-ef4c5261e72c tempest-ListServerFiltersTestJSON-1460936247 tempest-ListServerFiltersTestJSON-1460936247-project-member] [instance: 3cea970e-78f8-4b67-9350-65d3507f6b18] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1384.282381] env[68571]: DEBUG nova.compute.manager [None req-8effc918-eea1-41da-bdf4-ef4c5261e72c tempest-ListServerFiltersTestJSON-1460936247 tempest-ListServerFiltersTestJSON-1460936247-project-member] [instance: 3cea970e-78f8-4b67-9350-65d3507f6b18] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1384.313015] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8effc918-eea1-41da-bdf4-ef4c5261e72c tempest-ListServerFiltersTestJSON-1460936247 tempest-ListServerFiltersTestJSON-1460936247-project-member] Lock "3cea970e-78f8-4b67-9350-65d3507f6b18" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 216.968s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.374046] env[68571]: DEBUG nova.compute.manager [None req-19904ea2-dc4f-47ec-be0a-568e0a5e9077 tempest-ListServerFiltersTestJSON-1460936247 tempest-ListServerFiltersTestJSON-1460936247-project-member] [instance: d62a50a6-fef2-42a8-a066-e36211c57f73] Starting instance... 
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1384.406495] env[68571]: DEBUG neutronclient.v2_0.client [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68571) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1384.407502] env[68571]: ERROR nova.compute.manager [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1384.407502] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] Traceback (most recent call last): [ 1384.407502] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1384.407502] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1384.407502] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1384.407502] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] result = getattr(controller, method)(*args, **kwargs) [ 1384.407502] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1384.407502] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return self._get(image_id) [ 1384.407502] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1384.407502] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1384.407502] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1384.407852] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] resp, body = self.http_client.get(url, headers=header) [ 1384.407852] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1384.407852] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return self.request(url, 'GET', **kwargs) [ 1384.407852] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1384.407852] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return self._handle_response(resp) [ 1384.407852] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1384.407852] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] raise exc.from_response(resp, resp.content) [ 1384.407852] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. [ 1384.407852] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] [ 1384.407852] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] During handling of the above exception, another exception occurred: [ 1384.407852] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] [ 1384.407852] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] Traceback (most recent call last): [ 1384.408279] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1384.408279] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] self.driver.spawn(context, instance, image_meta, [ 1384.408279] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1384.408279] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1384.408279] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1384.408279] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] self._fetch_image_if_missing(context, vi) [ 1384.408279] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1384.408279] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] image_fetch(context, vi, tmp_image_ds_loc) [ 1384.408279] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1384.408279] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] images.fetch_image( [ 1384.408279] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1384.408279] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] metadata = IMAGE_API.get(context, image_ref) [ 1384.408279] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1384.408625] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return session.show(context, image_id, [ 1384.408625] env[68571]: ERROR nova.compute.manager [instance: 
b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1384.408625] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] _reraise_translated_image_exception(image_id) [ 1384.408625] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1384.408625] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] raise new_exc.with_traceback(exc_trace) [ 1384.408625] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1384.408625] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1384.408625] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1384.408625] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] result = getattr(controller, method)(*args, **kwargs) [ 1384.408625] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1384.408625] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return self._get(image_id) [ 1384.408625] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1384.408625] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1384.408973] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1384.408973] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] resp, body = self.http_client.get(url, headers=header) [ 1384.408973] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1384.408973] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return self.request(url, 'GET', **kwargs) [ 1384.408973] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1384.408973] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return self._handle_response(resp) [ 1384.408973] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1384.408973] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] raise exc.from_response(resp, resp.content) [ 1384.408973] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] nova.exception.ImageNotAuthorized: Not authorized for image 
6e7bf233-3ffe-4b3b-a510-62353d0292a6. [ 1384.408973] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] [ 1384.408973] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] During handling of the above exception, another exception occurred: [ 1384.408973] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] [ 1384.408973] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] Traceback (most recent call last): [ 1384.409334] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/compute/manager.py", line 2430, in _do_build_and_run_instance [ 1384.409334] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] self._build_and_run_instance(context, instance, image, [ 1384.409334] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/compute/manager.py", line 2722, in _build_and_run_instance [ 1384.409334] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] raise exception.RescheduledException( [ 1384.409334] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] nova.exception.RescheduledException: Build of instance b60eb700-434f-4bea-a84f-9071402001c3 was re-scheduled: Not authorized for image 6e7bf233-3ffe-4b3b-a510-62353d0292a6. [ 1384.409334] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] [ 1384.409334] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] During handling of the above exception, another exception occurred: [ 1384.409334] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] [ 1384.409334] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] Traceback (most recent call last): [ 1384.409334] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1384.409334] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] ret = obj(*args, **kwargs) [ 1384.409334] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1384.409334] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] exception_handler_v20(status_code, error_body) [ 1384.409742] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1384.409742] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] raise client_exc(message=error_message, [ 1384.409742] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1384.409742] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] Neutron server returns request_ids: ['req-bd1e60b8-3853-4514-9cc0-8b2aeea7ddb1'] [ 1384.409742] env[68571]: ERROR 
nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] [ 1384.409742] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] During handling of the above exception, another exception occurred: [ 1384.409742] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] [ 1384.409742] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] Traceback (most recent call last): [ 1384.409742] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/compute/manager.py", line 3019, in _cleanup_allocated_networks [ 1384.409742] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] self._deallocate_network(context, instance, requested_networks) [ 1384.409742] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1384.409742] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] self.network_api.deallocate_for_instance( [ 1384.409742] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1384.410094] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] data = neutron.list_ports(**search_opts) [ 1384.410094] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1384.410094] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] ret = obj(*args, **kwargs) [ 1384.410094] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1384.410094] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return self.list('ports', self.ports_path, retrieve_all, [ 1384.410094] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1384.410094] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] ret = obj(*args, **kwargs) [ 1384.410094] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1384.410094] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] for r in self._pagination(collection, path, **params): [ 1384.410094] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1384.410094] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] res = self.get(path, params=params) [ 1384.410094] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1384.410094] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] ret = obj(*args, **kwargs) [ 1384.410450] env[68571]: ERROR 
nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1384.410450] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return self.retry_request("GET", action, body=body, [ 1384.410450] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1384.410450] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] ret = obj(*args, **kwargs) [ 1384.410450] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1384.410450] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return self.do_request(method, action, body=body, [ 1384.410450] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1384.410450] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] ret = obj(*args, **kwargs) [ 1384.410450] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1384.410450] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] self._handle_fault_response(status_code, replybody, resp) [ 1384.410450] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1384.410450] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] raise exception.Unauthorized() [ 1384.410450] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] nova.exception.Unauthorized: Not authorized. [ 1384.410857] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] [ 1384.429478] env[68571]: DEBUG nova.compute.manager [None req-19904ea2-dc4f-47ec-be0a-568e0a5e9077 tempest-ListServerFiltersTestJSON-1460936247 tempest-ListServerFiltersTestJSON-1460936247-project-member] [instance: d62a50a6-fef2-42a8-a066-e36211c57f73] Instance disappeared before build. 
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1384.433951] env[68571]: DEBUG oslo_concurrency.lockutils [None req-0306e316-ef0f-4c6f-b54f-d2915d9193ef tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.380s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.435255] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 269.931s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1384.435466] env[68571]: INFO nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6] During sync_power_state the instance has a pending task (deleting). Skip. [ 1384.435662] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "3495ea0a-f639-4ea3-a23b-b3ee8ffe07c6" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.458585] env[68571]: DEBUG oslo_concurrency.lockutils [None req-19904ea2-dc4f-47ec-be0a-568e0a5e9077 tempest-ListServerFiltersTestJSON-1460936247 tempest-ListServerFiltersTestJSON-1460936247-project-member] Lock "d62a50a6-fef2-42a8-a066-e36211c57f73" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 216.565s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.485249] env[68571]: DEBUG nova.compute.manager [None req-43190948-579f-4dfb-98b2-3f76e5c36d5b tempest-AttachVolumeShelveTestJSON-1274929045 tempest-AttachVolumeShelveTestJSON-1274929045-project-member] [instance: f0b9847b-9438-4be7-a081-db33dd3ff998] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1384.514879] env[68571]: DEBUG nova.compute.manager [None req-43190948-579f-4dfb-98b2-3f76e5c36d5b tempest-AttachVolumeShelveTestJSON-1274929045 tempest-AttachVolumeShelveTestJSON-1274929045-project-member] [instance: f0b9847b-9438-4be7-a081-db33dd3ff998] Instance disappeared before build. 
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1384.526483] env[68571]: INFO nova.scheduler.client.report [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Deleted allocations for instance b60eb700-434f-4bea-a84f-9071402001c3 [ 1384.543743] env[68571]: DEBUG oslo_concurrency.lockutils [None req-43190948-579f-4dfb-98b2-3f76e5c36d5b tempest-AttachVolumeShelveTestJSON-1274929045 tempest-AttachVolumeShelveTestJSON-1274929045-project-member] Lock "f0b9847b-9438-4be7-a081-db33dd3ff998" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 212.245s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.545372] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ac11f071-d184-47e0-881b-45a4d3430b51 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Lock "b60eb700-434f-4bea-a84f-9071402001c3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 629.841s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.546664] env[68571]: DEBUG oslo_concurrency.lockutils [None req-47ac41bb-2e62-497d-bb75-a808d4f4ddf7 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Lock "b60eb700-434f-4bea-a84f-9071402001c3" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 434.010s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1384.546933] env[68571]: DEBUG oslo_concurrency.lockutils [None req-47ac41bb-2e62-497d-bb75-a808d4f4ddf7 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Acquiring lock "b60eb700-434f-4bea-a84f-9071402001c3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1384.547157] env[68571]: DEBUG oslo_concurrency.lockutils [None req-47ac41bb-2e62-497d-bb75-a808d4f4ddf7 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Lock "b60eb700-434f-4bea-a84f-9071402001c3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1384.547326] env[68571]: DEBUG oslo_concurrency.lockutils [None req-47ac41bb-2e62-497d-bb75-a808d4f4ddf7 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Lock "b60eb700-434f-4bea-a84f-9071402001c3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.550686] env[68571]: INFO nova.compute.manager [None req-47ac41bb-2e62-497d-bb75-a808d4f4ddf7 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Terminating instance [ 1384.552702] env[68571]: DEBUG oslo_concurrency.lockutils [None req-47ac41bb-2e62-497d-bb75-a808d4f4ddf7 
tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Acquiring lock "refresh_cache-b60eb700-434f-4bea-a84f-9071402001c3" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1384.552864] env[68571]: DEBUG oslo_concurrency.lockutils [None req-47ac41bb-2e62-497d-bb75-a808d4f4ddf7 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Acquired lock "refresh_cache-b60eb700-434f-4bea-a84f-9071402001c3" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1384.553045] env[68571]: DEBUG nova.network.neutron [None req-47ac41bb-2e62-497d-bb75-a808d4f4ddf7 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1384.577129] env[68571]: DEBUG nova.compute.manager [None req-ed43f1dd-787a-4c98-87dd-815ee2c472d6 tempest-ServersTestMultiNic-1790639670 tempest-ServersTestMultiNic-1790639670-project-member] [instance: 6532563b-5e91-409f-be05-084196087a4d] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1384.585062] env[68571]: DEBUG nova.compute.manager [None req-5a0d3411-b39d-4cd4-badf-063b65754298 tempest-ImagesTestJSON-1315536367 tempest-ImagesTestJSON-1315536367-project-member] [instance: b6a0771c-53cb-4503-bbc0-db992326b245] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1384.617127] env[68571]: DEBUG nova.compute.manager [None req-ed43f1dd-787a-4c98-87dd-815ee2c472d6 tempest-ServersTestMultiNic-1790639670 tempest-ServersTestMultiNic-1790639670-project-member] [instance: 6532563b-5e91-409f-be05-084196087a4d] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1384.623974] env[68571]: DEBUG nova.compute.manager [None req-5a0d3411-b39d-4cd4-badf-063b65754298 tempest-ImagesTestJSON-1315536367 tempest-ImagesTestJSON-1315536367-project-member] [instance: b6a0771c-53cb-4503-bbc0-db992326b245] Instance disappeared before build. 
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1384.651159] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ed43f1dd-787a-4c98-87dd-815ee2c472d6 tempest-ServersTestMultiNic-1790639670 tempest-ServersTestMultiNic-1790639670-project-member] Lock "6532563b-5e91-409f-be05-084196087a4d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 202.792s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.666582] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5a0d3411-b39d-4cd4-badf-063b65754298 tempest-ImagesTestJSON-1315536367 tempest-ImagesTestJSON-1315536367-project-member] Lock "b6a0771c-53cb-4503-bbc0-db992326b245" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 205.692s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.668379] env[68571]: DEBUG nova.compute.manager [None req-aa1230a2-8b02-476b-9f56-cab3486b6af9 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 10b3cea3-b9d1-45b7-9ac8-b922952371ba] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1384.686510] env[68571]: DEBUG nova.compute.manager [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1384.710761] env[68571]: DEBUG nova.compute.manager [None req-aa1230a2-8b02-476b-9f56-cab3486b6af9 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 10b3cea3-b9d1-45b7-9ac8-b922952371ba] Instance disappeared before build. 
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1384.866339] env[68571]: DEBUG nova.network.neutron [None req-47ac41bb-2e62-497d-bb75-a808d4f4ddf7 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Updating instance_info_cache with network_info: [{"id": "b8d64d34-b317-489b-91e2-55b8239349e1", "address": "fa:16:3e:7c:e7:f1", "network": {"id": "802e91c0-b497-4996-a9a8-0fb2969a1fd5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "129da41d4b1a4202be57f86562f628cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb8d64d34-b3", "ovs_interfaceid": "b8d64d34-b317-489b-91e2-55b8239349e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1384.875154] env[68571]: DEBUG oslo_concurrency.lockutils [None req-47ac41bb-2e62-497d-bb75-a808d4f4ddf7 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Releasing lock "refresh_cache-b60eb700-434f-4bea-a84f-9071402001c3" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1384.875564] env[68571]: DEBUG nova.compute.manager [None req-47ac41bb-2e62-497d-bb75-a808d4f4ddf7 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Start destroying the instance on the hypervisor. 
{{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1384.875771] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-47ac41bb-2e62-497d-bb75-a808d4f4ddf7 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1384.876303] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-df354fac-f06c-4a36-8138-3582096a8917 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1384.885676] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-13801df7-75a4-4e88-965d-d09cbfd97bd8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1384.916706] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-47ac41bb-2e62-497d-bb75-a808d4f4ddf7 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b60eb700-434f-4bea-a84f-9071402001c3 could not be found. [ 1384.917556] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-47ac41bb-2e62-497d-bb75-a808d4f4ddf7 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1384.918407] env[68571]: INFO nova.compute.manager [None req-47ac41bb-2e62-497d-bb75-a808d4f4ddf7 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1384.918645] env[68571]: DEBUG oslo.service.loopingcall [None req-47ac41bb-2e62-497d-bb75-a808d4f4ddf7 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1384.921374] env[68571]: DEBUG nova.compute.manager [-] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1384.921507] env[68571]: DEBUG nova.network.neutron [-] [instance: b60eb700-434f-4bea-a84f-9071402001c3] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1384.934652] env[68571]: DEBUG oslo_concurrency.lockutils [None req-aa1230a2-8b02-476b-9f56-cab3486b6af9 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Lock "10b3cea3-b9d1-45b7-9ac8-b922952371ba" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 198.118s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1384.969778] env[68571]: DEBUG nova.compute.manager [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1384.994556] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1384.994834] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1384.996451] env[68571]: INFO nova.compute.claims [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1385.028755] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1385.038295] env[68571]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68571) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1385.038295] env[68571]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The 
request you have made requires authentication.'}} [ 1385.038295] env[68571]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1385.038295] env[68571]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1385.038295] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1385.038295] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1385.038295] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1385.038295] env[68571]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1385.038295] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1385.038295] env[68571]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1385.038295] env[68571]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1385.039043] env[68571]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-c07eb802-32fd-467b-9f3d-9c88ea1b2d11'] [ 1385.039043] env[68571]: ERROR oslo.service.loopingcall [ 1385.039043] env[68571]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1385.039043] env[68571]: ERROR oslo.service.loopingcall [ 1385.039043] env[68571]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1385.039043] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1385.039043] env[68571]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1385.039043] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1385.039043] env[68571]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1385.039043] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1385.039043] env[68571]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1385.039043] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1385.039043] env[68571]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1385.039043] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1385.039043] env[68571]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1385.039043] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1385.039043] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1385.039043] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1385.039043] env[68571]: ERROR 
oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1385.039989] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1385.039989] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1385.039989] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1385.039989] env[68571]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1385.039989] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1385.039989] env[68571]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1385.039989] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1385.039989] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1385.039989] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1385.039989] env[68571]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1385.039989] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1385.039989] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1385.039989] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1385.039989] env[68571]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1385.039989] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1385.039989] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1385.039989] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1385.039989] env[68571]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1385.040679] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1385.040679] env[68571]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1385.040679] env[68571]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1385.040679] env[68571]: ERROR oslo.service.loopingcall [ 1385.040679] env[68571]: ERROR nova.compute.manager [None req-47ac41bb-2e62-497d-bb75-a808d4f4ddf7 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
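The failure above is Nova's neutron client wrapper (the `wrapper` frames from /opt/stack/nova/nova/network/neutron.py in the traceback) translating a raw `neutronclient.common.exceptions.Unauthorized` (HTTP 401) into a Nova exception: `nova.exception.Unauthorized` when the client was built from the user's token, and `nova.exception.NeutronAdminCredentialConfigurationInvalid` when it was built from the `[neutron]` service credentials in nova.conf, which is why the log asks the operator to verify those credentials. Below is a minimal sketch of that translation pattern, not Nova's literal source; the `translate_neutron_unauthorized` helper and its `admin` flag are illustrative only, while the exception classes are the ones named in the log.

```python
# Minimal sketch (assumption: illustrative, not nova's actual code) of the
# exception-translation wrapper visible in the tracebacks above. The real
# wrapper in nova/network/neutron.py proxies every callable on the neutron
# client; here a single function is wrapped for brevity.
import functools

from neutronclient.common import exceptions as neutron_client_exc

from nova import exception


def translate_neutron_unauthorized(func, admin):
    """Convert a neutron 401 into the exception Nova callers expect.

    admin=True means the client was built from the [neutron] service
    credentials in nova.conf, so a 401 points at a deployment
    misconfiguration rather than an expired end-user token.
    """
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        try:
            return func(*args, **kwargs)
        except neutron_client_exc.Unauthorized:
            if admin:
                # Matches the "please verify Neutron admin credential
                # located in nova.conf" error logged above.
                raise exception.NeutronAdminCredentialConfigurationInvalid()
            # Matches the earlier "Failed to deallocate networks:
            # nova.exception.Unauthorized" path for tenant-scoped calls.
            raise exception.Unauthorized()
    return wrapper
```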
[ 1385.079428] env[68571]: ERROR nova.compute.manager [None req-47ac41bb-2e62-497d-bb75-a808d4f4ddf7 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1385.079428] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] Traceback (most recent call last): [ 1385.079428] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1385.079428] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] ret = obj(*args, **kwargs) [ 1385.079428] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1385.079428] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] exception_handler_v20(status_code, error_body) [ 1385.079428] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1385.079428] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] raise client_exc(message=error_message, [ 1385.079428] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1385.079428] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] Neutron server returns request_ids: ['req-c07eb802-32fd-467b-9f3d-9c88ea1b2d11'] [ 1385.079428] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] [ 1385.080569] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] During handling of the above exception, another exception occurred: [ 1385.080569] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] [ 1385.080569] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] Traceback (most recent call last): [ 1385.080569] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1385.080569] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] self._delete_instance(context, instance, bdms) [ 1385.080569] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1385.080569] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] self._shutdown_instance(context, instance, bdms) [ 1385.080569] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1385.080569] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] self._try_deallocate_network(context, 
instance, requested_networks) [ 1385.080569] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1385.080569] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] with excutils.save_and_reraise_exception(): [ 1385.080569] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1385.080569] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] self.force_reraise() [ 1385.080974] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1385.080974] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] raise self.value [ 1385.080974] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1385.080974] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] _deallocate_network_with_retries() [ 1385.080974] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1385.080974] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return evt.wait() [ 1385.080974] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1385.080974] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] result = hub.switch() [ 1385.080974] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1385.080974] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return self.greenlet.switch() [ 1385.080974] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1385.080974] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] result = func(*self.args, **self.kw) [ 1385.081314] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1385.081314] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] result = f(*args, **kwargs) [ 1385.081314] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1385.081314] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] self._deallocate_network( [ 1385.081314] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 
1385.081314] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] self.network_api.deallocate_for_instance( [ 1385.081314] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1385.081314] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] data = neutron.list_ports(**search_opts) [ 1385.081314] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1385.081314] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] ret = obj(*args, **kwargs) [ 1385.081314] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1385.081314] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return self.list('ports', self.ports_path, retrieve_all, [ 1385.081314] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1385.081673] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] ret = obj(*args, **kwargs) [ 1385.081673] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1385.081673] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] for r in self._pagination(collection, path, **params): [ 1385.081673] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1385.081673] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] res = self.get(path, params=params) [ 1385.081673] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1385.081673] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] ret = obj(*args, **kwargs) [ 1385.081673] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1385.081673] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return self.retry_request("GET", action, body=body, [ 1385.081673] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1385.081673] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] ret = obj(*args, **kwargs) [ 1385.081673] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1385.081673] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] return self.do_request(method, action, body=body, [ 1385.086227] env[68571]: ERROR 
nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1385.086227] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] ret = obj(*args, **kwargs) [ 1385.086227] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1385.086227] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] self._handle_fault_response(status_code, replybody, resp) [ 1385.086227] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1385.086227] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1385.086227] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1385.086227] env[68571]: ERROR nova.compute.manager [instance: b60eb700-434f-4bea-a84f-9071402001c3] [ 1385.107224] env[68571]: DEBUG oslo_concurrency.lockutils [None req-47ac41bb-2e62-497d-bb75-a808d4f4ddf7 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Lock "b60eb700-434f-4bea-a84f-9071402001c3" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.561s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1385.108562] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "b60eb700-434f-4bea-a84f-9071402001c3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 270.604s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1385.108758] env[68571]: INFO nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: b60eb700-434f-4bea-a84f-9071402001c3] During sync_power_state the instance has a pending task (deleting). Skip. [ 1385.108932] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "b60eb700-434f-4bea-a84f-9071402001c3" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1385.218198] env[68571]: INFO nova.compute.manager [None req-47ac41bb-2e62-497d-bb75-a808d4f4ddf7 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] [instance: b60eb700-434f-4bea-a84f-9071402001c3] Successfully reverted task state from None on failure for instance. [ 1385.221774] env[68571]: ERROR oslo_messaging.rpc.server [None req-47ac41bb-2e62-497d-bb75-a808d4f4ddf7 tempest-MigrationsAdminTest-1386299509 tempest-MigrationsAdminTest-1386299509-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1385.221774] env[68571]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1385.221774] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1385.221774] env[68571]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1385.221774] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1385.221774] env[68571]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1385.221774] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1385.221774] env[68571]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1385.221774] env[68571]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1385.221774] env[68571]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-c07eb802-32fd-467b-9f3d-9c88ea1b2d11'] [ 1385.221774] env[68571]: ERROR oslo_messaging.rpc.server [ 1385.221774] env[68571]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1385.221774] env[68571]: ERROR oslo_messaging.rpc.server [ 1385.221774] env[68571]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1385.221774] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1385.221774] env[68571]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1385.222319] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1385.222319] env[68571]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1385.222319] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1385.222319] env[68571]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1385.222319] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1385.222319] env[68571]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1385.222319] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1385.222319] env[68571]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1385.222319] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1385.222319] env[68571]: ERROR oslo_messaging.rpc.server raise self.value [ 1385.222319] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1385.222319] env[68571]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1385.222319] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1385.222319] env[68571]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 1385.222319] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1385.222319] env[68571]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1385.222319] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1385.222319] env[68571]: ERROR oslo_messaging.rpc.server raise self.value [ 1385.222828] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1385.222828] env[68571]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1385.222828] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 1385.222828] env[68571]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1385.222828] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1385.222828] env[68571]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1385.222828] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1385.222828] env[68571]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1385.222828] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1385.222828] env[68571]: ERROR oslo_messaging.rpc.server raise self.value [ 1385.222828] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1385.222828] env[68571]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1385.222828] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 1385.222828] env[68571]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1385.222828] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1385.222828] env[68571]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1385.222828] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 1385.222828] env[68571]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1385.223323] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1385.223323] env[68571]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1385.223323] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1385.223323] env[68571]: ERROR oslo_messaging.rpc.server raise self.value [ 1385.223323] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1385.223323] env[68571]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1385.223323] env[68571]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1385.223323] env[68571]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1385.223323] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1385.223323] env[68571]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1385.223323] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1385.223323] env[68571]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1385.223323] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1385.223323] env[68571]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1385.223323] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1385.223323] env[68571]: ERROR oslo_messaging.rpc.server raise self.value [ 1385.223323] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1385.223323] env[68571]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1385.223819] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1385.223819] env[68571]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1385.223819] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1385.223819] env[68571]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1385.223819] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1385.223819] env[68571]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1385.223819] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1385.223819] env[68571]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1385.223819] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1385.223819] env[68571]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1385.223819] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1385.223819] env[68571]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1385.223819] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1385.223819] env[68571]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1385.223819] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1385.223819] env[68571]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1385.223819] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1385.223819] env[68571]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1385.224320] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1385.224320] env[68571]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1385.224320] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1385.224320] env[68571]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1385.224320] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1385.224320] env[68571]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1385.224320] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1385.224320] env[68571]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1385.224320] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1385.224320] env[68571]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1385.224320] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1385.224320] env[68571]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1385.224320] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1385.224320] env[68571]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1385.224320] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1385.224320] env[68571]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1385.224320] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1385.224320] env[68571]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1385.224808] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1385.224808] env[68571]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1385.224808] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1385.224808] env[68571]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1385.224808] env[68571]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
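[editor] The traceback above ends in nova.exception.NeutronAdminCredentialConfigurationInvalid: the instance-delete path called neutron's list_ports, the neutron server answered HTTP 401, and the client wrapper in nova/network/neutron.py (the wrapper frames at lines 196 and 212 above) translated the neutronclient Unauthorized into a configuration error, because the request was made with nova's own service credentials rather than a user token. Below is a minimal, self-contained sketch of that translation pattern; the exception classes and list_ports function are stand-ins for illustration, not the real neutronclient or nova code.

    # Sketch of the "translate 401 into a config error" wrapper implied by
    # the traceback above. Stand-in exceptions; stdlib only.
    import functools

    class Unauthorized(Exception):
        """Stand-in for neutronclient.common.exceptions.Unauthorized."""

    class NeutronAdminCredentialConfigurationInvalid(Exception):
        """Stand-in for the nova.exception class of the same name."""

    def translate_auth_errors(func, using_admin_credentials=True):
        """Wrap a client call; a 401 on admin credentials is a config bug."""
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except Unauthorized:
                if using_admin_credentials:
                    # Admin credentials come from static config (the [neutron]
                    # section of nova.conf); a 401 here means they are wrong,
                    # so surface a configuration error instead of retrying.
                    raise NeutronAdminCredentialConfigurationInvalid()
                raise
        return wrapper

    @translate_auth_errors
    def list_ports(**search_opts):
        # Simulate the neutron server rejecting the request with HTTP 401.
        raise Unauthorized("The request you have made requires authentication.")

    try:
        list_ports(device_id="some-instance-uuid")
    except NeutronAdminCredentialConfigurationInvalid:
        print("401 translated into a configuration error")

The practical consequence, visible in the log, is that the delete request fails inside _try_deallocate_network and is re-raised to the RPC server rather than being swallowed, since deallocating ports without valid credentials would leak them.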
[ 1385.224808] env[68571]: ERROR oslo_messaging.rpc.server [ 1385.268337] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2ea5fb0e-0157-4298-929d-46e8dda4abe6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1385.276141] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cae0c14f-37ff-4d3f-ad86-8217eb17a5da {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1385.306857] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bb84d46-dbd6-43a5-9af7-8a5ffb4b39db {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1385.313934] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-73f04d8d-9aed-4fc0-8b03-1075271733be {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1385.326741] env[68571]: DEBUG nova.compute.provider_tree [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1385.334823] env[68571]: DEBUG nova.scheduler.client.report [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1385.486179] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.491s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1385.486716] env[68571]: DEBUG nova.compute.manager [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Start building networks asynchronously for instance. 
{{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1385.489829] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.462s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1385.491705] env[68571]: INFO nova.compute.claims [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1385.531823] env[68571]: DEBUG nova.compute.utils [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1385.533530] env[68571]: DEBUG nova.compute.manager [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1385.533737] env[68571]: DEBUG nova.network.neutron [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1385.549800] env[68571]: DEBUG nova.compute.manager [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1385.601007] env[68571]: DEBUG nova.policy [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c7c7b7b944b541c1bed14364e3f20015', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3b9f46c7d8b9484dbbd1e9cd57fb6d68', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 1385.627148] env[68571]: DEBUG nova.compute.manager [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1385.684071] env[68571]: DEBUG nova.virt.hardware [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1385.684331] env[68571]: DEBUG nova.virt.hardware [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1385.684489] env[68571]: DEBUG nova.virt.hardware [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1385.684673] env[68571]: DEBUG nova.virt.hardware [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1385.684818] env[68571]: DEBUG nova.virt.hardware [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1385.685022] env[68571]: DEBUG nova.virt.hardware [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1385.685189] env[68571]: DEBUG nova.virt.hardware [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1385.685340] env[68571]: DEBUG nova.virt.hardware [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1385.685511] env[68571]: DEBUG nova.virt.hardware [None 
req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1385.685676] env[68571]: DEBUG nova.virt.hardware [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1385.685846] env[68571]: DEBUG nova.virt.hardware [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1385.686728] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-16c034e8-6b9c-4437-953f-21f2ca1163a4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1385.699063] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-02b38e57-40b1-4688-b5c3-8faf040dc374 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1385.776809] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d435d38f-504b-4f8e-b3e3-d17d3de48385 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1385.783971] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a0943c2a-c5b3-4aa3-81e7-36d0c7ee694e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1385.813573] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1f00170-fa82-4621-9471-dedf5848fd20 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1385.821066] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c8e0be9-fd98-4c21-b0d5-c3811f5e7e39 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1385.836066] env[68571]: DEBUG nova.compute.provider_tree [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1385.845026] env[68571]: DEBUG nova.scheduler.client.report [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 
'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1385.862167] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.372s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1385.862674] env[68571]: DEBUG nova.compute.manager [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1385.939105] env[68571]: DEBUG nova.compute.utils [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1385.940412] env[68571]: DEBUG nova.compute.manager [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1385.940589] env[68571]: DEBUG nova.network.neutron [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1385.958144] env[68571]: DEBUG nova.compute.manager [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1386.043022] env[68571]: DEBUG nova.policy [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b21fda9650f1447a81a5994f05fc8078', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '157830f5757b429383d95b2b4c0a384c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 1386.046841] env[68571]: DEBUG nova.compute.manager [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1386.080953] env[68571]: DEBUG nova.virt.hardware [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1386.081202] env[68571]: DEBUG nova.virt.hardware [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1386.081354] env[68571]: DEBUG nova.virt.hardware [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1386.081527] env[68571]: DEBUG nova.virt.hardware [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1386.081669] env[68571]: DEBUG nova.virt.hardware [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1386.081812] env[68571]: DEBUG nova.virt.hardware [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1386.085495] env[68571]: DEBUG nova.virt.hardware [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1386.085727] env[68571]: DEBUG nova.virt.hardware [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1386.085908] env[68571]: DEBUG nova.virt.hardware [None 
req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1386.086089] env[68571]: DEBUG nova.virt.hardware [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1386.086264] env[68571]: DEBUG nova.virt.hardware [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1386.087125] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d03e7619-56db-4932-9789-d3e38e62d69d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1386.092196] env[68571]: DEBUG nova.network.neutron [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Successfully created port: 6e7c6b8c-ac61-4982-b672-fc4527bdb7a6 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1386.101476] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94d5eae5-1103-440d-ba9e-c09603bb54a9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1386.536609] env[68571]: DEBUG nova.network.neutron [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Successfully created port: b29a225b-ea58-4186-90f4-7f22864010c9 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1387.212044] env[68571]: DEBUG nova.compute.manager [req-f0522ef1-b137-47ae-97c2-477601beaf54 req-331e73bd-82c7-4791-ba4b-006c54f54571 service nova] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Received event network-vif-plugged-6e7c6b8c-ac61-4982-b672-fc4527bdb7a6 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1387.212506] env[68571]: DEBUG oslo_concurrency.lockutils [req-f0522ef1-b137-47ae-97c2-477601beaf54 req-331e73bd-82c7-4791-ba4b-006c54f54571 service nova] Acquiring lock "47df3a07-1271-482c-bd3a-92fb9cef17bd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1387.212654] env[68571]: DEBUG oslo_concurrency.lockutils [req-f0522ef1-b137-47ae-97c2-477601beaf54 req-331e73bd-82c7-4791-ba4b-006c54f54571 service nova] Lock "47df3a07-1271-482c-bd3a-92fb9cef17bd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1387.212873] env[68571]: DEBUG oslo_concurrency.lockutils 
[req-f0522ef1-b137-47ae-97c2-477601beaf54 req-331e73bd-82c7-4791-ba4b-006c54f54571 service nova] Lock "47df3a07-1271-482c-bd3a-92fb9cef17bd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1387.213145] env[68571]: DEBUG nova.compute.manager [req-f0522ef1-b137-47ae-97c2-477601beaf54 req-331e73bd-82c7-4791-ba4b-006c54f54571 service nova] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] No waiting events found dispatching network-vif-plugged-6e7c6b8c-ac61-4982-b672-fc4527bdb7a6 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1387.213389] env[68571]: WARNING nova.compute.manager [req-f0522ef1-b137-47ae-97c2-477601beaf54 req-331e73bd-82c7-4791-ba4b-006c54f54571 service nova] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Received unexpected event network-vif-plugged-6e7c6b8c-ac61-4982-b672-fc4527bdb7a6 for instance with vm_state building and task_state spawning. [ 1387.237251] env[68571]: DEBUG nova.network.neutron [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Successfully updated port: 6e7c6b8c-ac61-4982-b672-fc4527bdb7a6 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1387.265862] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Acquiring lock "refresh_cache-47df3a07-1271-482c-bd3a-92fb9cef17bd" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1387.266171] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Acquired lock "refresh_cache-47df3a07-1271-482c-bd3a-92fb9cef17bd" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1387.266354] env[68571]: DEBUG nova.network.neutron [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1387.323160] env[68571]: DEBUG nova.network.neutron [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Instance cache missing network info. 
{{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1387.542975] env[68571]: DEBUG nova.network.neutron [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Updating instance_info_cache with network_info: [{"id": "6e7c6b8c-ac61-4982-b672-fc4527bdb7a6", "address": "fa:16:3e:ba:e4:1e", "network": {"id": "27628c43-2367-4b9f-b532-3a88d1a5696a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-68589408-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3b9f46c7d8b9484dbbd1e9cd57fb6d68", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ea45c024-d603-4bac-9c1b-f302437ea4fe", "external-id": "nsx-vlan-transportzone-946", "segmentation_id": 946, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6e7c6b8c-ac", "ovs_interfaceid": "6e7c6b8c-ac61-4982-b672-fc4527bdb7a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1387.577027] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Releasing lock "refresh_cache-47df3a07-1271-482c-bd3a-92fb9cef17bd" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1387.577027] env[68571]: DEBUG nova.compute.manager [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Instance network_info: |[{"id": "6e7c6b8c-ac61-4982-b672-fc4527bdb7a6", "address": "fa:16:3e:ba:e4:1e", "network": {"id": "27628c43-2367-4b9f-b532-3a88d1a5696a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-68589408-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3b9f46c7d8b9484dbbd1e9cd57fb6d68", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ea45c024-d603-4bac-9c1b-f302437ea4fe", "external-id": "nsx-vlan-transportzone-946", "segmentation_id": 946, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6e7c6b8c-ac", "ovs_interfaceid": "6e7c6b8c-ac61-4982-b672-fc4527bdb7a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1387.577216] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:ba:e4:1e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'ea45c024-d603-4bac-9c1b-f302437ea4fe', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '6e7c6b8c-ac61-4982-b672-fc4527bdb7a6', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1387.585405] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Creating folder: Project (3b9f46c7d8b9484dbbd1e9cd57fb6d68). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1387.587056] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-37f55daa-2ba4-4747-ab2d-155ef8acd7f7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1387.597974] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Created folder: Project (3b9f46c7d8b9484dbbd1e9cd57fb6d68) in parent group-v692787. [ 1387.598178] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Creating folder: Instances. Parent ref: group-v692866. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1387.598411] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e8e25883-c1e9-4d56-a048-68b781571cfd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1387.608034] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Created folder: Instances in parent group-v692866. [ 1387.608687] env[68571]: DEBUG oslo.service.loopingcall [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1387.608687] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1387.608687] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-171bbddd-fe89-44f0-ad60-99fa52e05095 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1387.630569] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1387.630569] env[68571]: value = "task-3467705" [ 1387.630569] env[68571]: _type = "Task" [ 1387.630569] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1387.638094] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467705, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1387.801224] env[68571]: DEBUG nova.network.neutron [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Successfully updated port: b29a225b-ea58-4186-90f4-7f22864010c9 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1387.828787] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquiring lock "refresh_cache-73ba7761-3724-46ed-95c5-e93a6627a2d3" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1387.828890] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquired lock "refresh_cache-73ba7761-3724-46ed-95c5-e93a6627a2d3" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1387.829014] env[68571]: DEBUG nova.network.neutron [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1387.868863] env[68571]: DEBUG nova.network.neutron [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Instance cache missing network info. 
{{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1388.051245] env[68571]: DEBUG nova.network.neutron [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Updating instance_info_cache with network_info: [{"id": "b29a225b-ea58-4186-90f4-7f22864010c9", "address": "fa:16:3e:0c:30:7c", "network": {"id": "375189f9-b770-49f9-a6e3-f686fe031694", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-452330210-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "157830f5757b429383d95b2b4c0a384c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c979f78-8597-41f8-b1de-995014032689", "external-id": "nsx-vlan-transportzone-477", "segmentation_id": 477, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb29a225b-ea", "ovs_interfaceid": "b29a225b-ea58-4186-90f4-7f22864010c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1388.062969] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Releasing lock "refresh_cache-73ba7761-3724-46ed-95c5-e93a6627a2d3" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1388.063355] env[68571]: DEBUG nova.compute.manager [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Instance network_info: |[{"id": "b29a225b-ea58-4186-90f4-7f22864010c9", "address": "fa:16:3e:0c:30:7c", "network": {"id": "375189f9-b770-49f9-a6e3-f686fe031694", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-452330210-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "157830f5757b429383d95b2b4c0a384c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c979f78-8597-41f8-b1de-995014032689", "external-id": "nsx-vlan-transportzone-477", "segmentation_id": 477, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb29a225b-ea", "ovs_interfaceid": "b29a225b-ea58-4186-90f4-7f22864010c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1388.063807] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0c:30:7c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8c979f78-8597-41f8-b1de-995014032689', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'b29a225b-ea58-4186-90f4-7f22864010c9', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1388.071382] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Creating folder: Project (157830f5757b429383d95b2b4c0a384c). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1388.071961] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-3742ea06-35b6-441c-a656-3fccde13c59d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1388.082017] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Created folder: Project (157830f5757b429383d95b2b4c0a384c) in parent group-v692787. [ 1388.082288] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Creating folder: Instances. Parent ref: group-v692869. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1388.082532] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a63e98e6-db0f-49df-84f3-a149d4691d84 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1388.090492] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Created folder: Instances in parent group-v692869. [ 1388.090670] env[68571]: DEBUG oslo.service.loopingcall [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1388.090855] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1388.091073] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a15cf274-536b-4112-afdb-a9a9b3e81a74 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1388.110550] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1388.110550] env[68571]: value = "task-3467708" [ 1388.110550] env[68571]: _type = "Task" [ 1388.110550] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1388.117694] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467708, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1388.140900] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467705, 'name': CreateVM_Task, 'duration_secs': 0.308031} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1388.141099] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1388.141765] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1388.141933] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1388.142271] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1388.142525] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8bea16f5-0aaf-4e7d-bc36-ca53d49dcaee {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1388.147298] env[68571]: DEBUG oslo_vmware.api [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Waiting for the task: (returnval){ [ 1388.147298] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52b0d036-bbfd-471d-4995-5d079a1415ea" [ 1388.147298] env[68571]: _type = "Task" [ 1388.147298] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1388.155599] env[68571]: DEBUG oslo_vmware.api [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52b0d036-bbfd-471d-4995-5d079a1415ea, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1388.621747] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467708, 'name': CreateVM_Task} progress is 99%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1388.656380] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1388.656642] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1388.656863] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1389.121469] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467708, 'name': CreateVM_Task} progress is 99%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1389.244910] env[68571]: DEBUG nova.compute.manager [req-6a8d9f69-3844-4407-9c37-b502f08280c9 req-55878255-e324-43c5-b539-564899171d60 service nova] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Received event network-changed-6e7c6b8c-ac61-4982-b672-fc4527bdb7a6 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1389.245127] env[68571]: DEBUG nova.compute.manager [req-6a8d9f69-3844-4407-9c37-b502f08280c9 req-55878255-e324-43c5-b539-564899171d60 service nova] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Refreshing instance network info cache due to event network-changed-6e7c6b8c-ac61-4982-b672-fc4527bdb7a6. 
{{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1389.245344] env[68571]: DEBUG oslo_concurrency.lockutils [req-6a8d9f69-3844-4407-9c37-b502f08280c9 req-55878255-e324-43c5-b539-564899171d60 service nova] Acquiring lock "refresh_cache-47df3a07-1271-482c-bd3a-92fb9cef17bd" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1389.245540] env[68571]: DEBUG oslo_concurrency.lockutils [req-6a8d9f69-3844-4407-9c37-b502f08280c9 req-55878255-e324-43c5-b539-564899171d60 service nova] Acquired lock "refresh_cache-47df3a07-1271-482c-bd3a-92fb9cef17bd" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1389.245745] env[68571]: DEBUG nova.network.neutron [req-6a8d9f69-3844-4407-9c37-b502f08280c9 req-55878255-e324-43c5-b539-564899171d60 service nova] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Refreshing network info cache for port 6e7c6b8c-ac61-4982-b672-fc4527bdb7a6 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1389.503071] env[68571]: DEBUG nova.network.neutron [req-6a8d9f69-3844-4407-9c37-b502f08280c9 req-55878255-e324-43c5-b539-564899171d60 service nova] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Updated VIF entry in instance network info cache for port 6e7c6b8c-ac61-4982-b672-fc4527bdb7a6. {{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1389.503448] env[68571]: DEBUG nova.network.neutron [req-6a8d9f69-3844-4407-9c37-b502f08280c9 req-55878255-e324-43c5-b539-564899171d60 service nova] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Updating instance_info_cache with network_info: [{"id": "6e7c6b8c-ac61-4982-b672-fc4527bdb7a6", "address": "fa:16:3e:ba:e4:1e", "network": {"id": "27628c43-2367-4b9f-b532-3a88d1a5696a", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-68589408-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "3b9f46c7d8b9484dbbd1e9cd57fb6d68", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "ea45c024-d603-4bac-9c1b-f302437ea4fe", "external-id": "nsx-vlan-transportzone-946", "segmentation_id": 946, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap6e7c6b8c-ac", "ovs_interfaceid": "6e7c6b8c-ac61-4982-b672-fc4527bdb7a6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1389.512914] env[68571]: DEBUG oslo_concurrency.lockutils [req-6a8d9f69-3844-4407-9c37-b502f08280c9 req-55878255-e324-43c5-b539-564899171d60 service nova] Releasing lock "refresh_cache-47df3a07-1271-482c-bd3a-92fb9cef17bd" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1389.513190] env[68571]: DEBUG nova.compute.manager [req-6a8d9f69-3844-4407-9c37-b502f08280c9 req-55878255-e324-43c5-b539-564899171d60 service nova] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Received event 
network-vif-plugged-b29a225b-ea58-4186-90f4-7f22864010c9 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1389.513549] env[68571]: DEBUG oslo_concurrency.lockutils [req-6a8d9f69-3844-4407-9c37-b502f08280c9 req-55878255-e324-43c5-b539-564899171d60 service nova] Acquiring lock "73ba7761-3724-46ed-95c5-e93a6627a2d3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1389.513613] env[68571]: DEBUG oslo_concurrency.lockutils [req-6a8d9f69-3844-4407-9c37-b502f08280c9 req-55878255-e324-43c5-b539-564899171d60 service nova] Lock "73ba7761-3724-46ed-95c5-e93a6627a2d3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1389.513773] env[68571]: DEBUG oslo_concurrency.lockutils [req-6a8d9f69-3844-4407-9c37-b502f08280c9 req-55878255-e324-43c5-b539-564899171d60 service nova] Lock "73ba7761-3724-46ed-95c5-e93a6627a2d3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1389.513939] env[68571]: DEBUG nova.compute.manager [req-6a8d9f69-3844-4407-9c37-b502f08280c9 req-55878255-e324-43c5-b539-564899171d60 service nova] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] No waiting events found dispatching network-vif-plugged-b29a225b-ea58-4186-90f4-7f22864010c9 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1389.514126] env[68571]: WARNING nova.compute.manager [req-6a8d9f69-3844-4407-9c37-b502f08280c9 req-55878255-e324-43c5-b539-564899171d60 service nova] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Received unexpected event network-vif-plugged-b29a225b-ea58-4186-90f4-7f22864010c9 for instance with vm_state building and task_state spawning. [ 1389.514291] env[68571]: DEBUG nova.compute.manager [req-6a8d9f69-3844-4407-9c37-b502f08280c9 req-55878255-e324-43c5-b539-564899171d60 service nova] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Received event network-changed-b29a225b-ea58-4186-90f4-7f22864010c9 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1389.514452] env[68571]: DEBUG nova.compute.manager [req-6a8d9f69-3844-4407-9c37-b502f08280c9 req-55878255-e324-43c5-b539-564899171d60 service nova] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Refreshing instance network info cache due to event network-changed-b29a225b-ea58-4186-90f4-7f22864010c9. 
{{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1389.514618] env[68571]: DEBUG oslo_concurrency.lockutils [req-6a8d9f69-3844-4407-9c37-b502f08280c9 req-55878255-e324-43c5-b539-564899171d60 service nova] Acquiring lock "refresh_cache-73ba7761-3724-46ed-95c5-e93a6627a2d3" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1389.514851] env[68571]: DEBUG oslo_concurrency.lockutils [req-6a8d9f69-3844-4407-9c37-b502f08280c9 req-55878255-e324-43c5-b539-564899171d60 service nova] Acquired lock "refresh_cache-73ba7761-3724-46ed-95c5-e93a6627a2d3" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1389.514913] env[68571]: DEBUG nova.network.neutron [req-6a8d9f69-3844-4407-9c37-b502f08280c9 req-55878255-e324-43c5-b539-564899171d60 service nova] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Refreshing network info cache for port b29a225b-ea58-4186-90f4-7f22864010c9 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1389.622792] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467708, 'name': CreateVM_Task, 'duration_secs': 1.319312} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1389.623070] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1389.623657] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1389.623849] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1389.624205] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1389.624480] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f55f366c-17d6-4086-ad15-28a9233b937c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1389.628605] env[68571]: DEBUG oslo_vmware.api [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Waiting for the task: (returnval){ [ 1389.628605] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52b71e94-8b93-58d9-e887-a519078bfed2" [ 1389.628605] env[68571]: _type = "Task" [ 1389.628605] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1389.635745] env[68571]: DEBUG oslo_vmware.api [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52b71e94-8b93-58d9-e887-a519078bfed2, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1389.751911] env[68571]: DEBUG nova.network.neutron [req-6a8d9f69-3844-4407-9c37-b502f08280c9 req-55878255-e324-43c5-b539-564899171d60 service nova] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Updated VIF entry in instance network info cache for port b29a225b-ea58-4186-90f4-7f22864010c9. {{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1389.752289] env[68571]: DEBUG nova.network.neutron [req-6a8d9f69-3844-4407-9c37-b502f08280c9 req-55878255-e324-43c5-b539-564899171d60 service nova] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Updating instance_info_cache with network_info: [{"id": "b29a225b-ea58-4186-90f4-7f22864010c9", "address": "fa:16:3e:0c:30:7c", "network": {"id": "375189f9-b770-49f9-a6e3-f686fe031694", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-452330210-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "157830f5757b429383d95b2b4c0a384c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c979f78-8597-41f8-b1de-995014032689", "external-id": "nsx-vlan-transportzone-477", "segmentation_id": 477, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapb29a225b-ea", "ovs_interfaceid": "b29a225b-ea58-4186-90f4-7f22864010c9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1389.765440] env[68571]: DEBUG oslo_concurrency.lockutils [req-6a8d9f69-3844-4407-9c37-b502f08280c9 req-55878255-e324-43c5-b539-564899171d60 service nova] Releasing lock "refresh_cache-73ba7761-3724-46ed-95c5-e93a6627a2d3" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1390.138962] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1390.139168] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1390.139382] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1395.505620] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ffa0f883-166e-400e-90d6-ab6437f668dd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Acquiring lock "47df3a07-1271-482c-bd3a-92fb9cef17bd" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1408.312240] env[68571]: DEBUG oslo_concurrency.lockutils [None req-aea423aa-004d-4278-9d6e-54494a094108 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquiring lock "73ba7761-3724-46ed-95c5-e93a6627a2d3" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1421.489808] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1421.501396] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1421.501602] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1421.501765] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1421.501922] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1421.503009] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-60f32212-81a0-4517-95ba-db216fb05148 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1421.511474] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6dd52a27-1417-4bb0-bc64-52cf76dd3426 {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1421.525393] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9cdf5df-864d-44ce-85f9-30e427cf2766 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1421.531414] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c074a32d-8f21-428d-aea4-4190005b1c46 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1421.559150] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180931MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1421.559289] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1421.559476] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1421.725246] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 5e571ae2-9d45-402d-bce5-6e3721cc5374 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1421.725414] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1421.725545] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance b90ac11a-50c6-4d12-a545-ccd92243e6ca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1421.725669] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance afe033a3-4e04-4249-beed-169a3e40a721 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1421.725793] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f5328efa-b3e0-48b2-8f13-9715e46cb017 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1421.725946] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance e025f82d-a6a8-4dd4-b891-872f4b2fa176 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1421.726087] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1421.726210] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 56c7e368-4032-4028-83f0-58b0cd3b3cbd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1421.726327] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 47df3a07-1271-482c-bd3a-92fb9cef17bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1421.726441] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 73ba7761-3724-46ed-95c5-e93a6627a2d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1421.737315] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance d890a035-a14e-4be0-97c8-87edd9bb88e4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1421.747440] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4f0cfa21-d717-494c-8201-2c85dd11e512 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1421.756520] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 9e8c8d14-144f-42e3-8556-796651b7b04f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1421.765545] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 1f8dd053-ebd8-4ad9-a607-ab364a3320ca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1421.774160] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 780d6657-20dc-4d8c-acec-0e002f79372b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1421.782470] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 47511138-2486-46a8-85d5-081388bb0b16 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1421.782679] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1421.782828] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1421.799020] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Refreshing inventories for resource provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1421.812217] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Updating ProviderTree inventory for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1421.812400] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Updating inventory in ProviderTree for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1421.822663] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Refreshing aggregate associations for resource provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd, aggregates: None {{(pid=68571) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1421.840692] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Refreshing trait associations for resource provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE {{(pid=68571) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1422.012519] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e655b7bb-ed1c-4f03-a0ff-da4e452f64c5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1422.020020] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-aa1e61bc-bded-461e-a8ba-fff89d704132 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1422.049391] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c055c5f-91f6-40c3-b87c-9124357b356d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1422.056482] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-35babeef-a1df-41ef-8e08-5ddc1f2bee60 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1422.069206] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1422.077911] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1422.092527] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1422.092723] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.533s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1422.092941] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1422.093105] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Cleaning up deleted instances with incomplete migration {{(pid=68571) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1423.093683] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1423.094013] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1423.094199] env[68571]: DEBUG oslo_service.periodic_task 
[None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1423.094388] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1425.489697] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1425.489994] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1425.489994] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1425.511756] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1425.511922] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1425.512213] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1425.512387] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1425.512522] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1425.512648] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1425.512771] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Skipping network cache update for instance because it is Building. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1425.512894] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1425.513029] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1425.513155] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1425.513278] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1426.488938] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1426.489208] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1426.489374] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1427.489656] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1431.490969] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1431.490969] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Cleaning up deleted instances {{(pid=68571) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1431.500941] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] There are 0 instances to clean {{(pid=68571) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1432.271129] env[68571]: WARNING oslo_vmware.rw_handles [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1432.271129] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1432.271129] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1432.271129] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1432.271129] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1432.271129] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 1432.271129] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1432.271129] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1432.271129] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1432.271129] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1432.271129] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1432.271129] env[68571]: ERROR oslo_vmware.rw_handles [ 1432.271672] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/ffc26c8a-9234-41f1-b2ce-bd61690cf758/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1432.273446] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Caching image {{(pid=68571) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1432.273680] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Copying Virtual Disk [datastore1] vmware_temp/ffc26c8a-9234-41f1-b2ce-bd61690cf758/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/ffc26c8a-9234-41f1-b2ce-bd61690cf758/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1432.274016] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-7acaf78f-0a08-4448-a101-b83d908d4858 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1432.282108] env[68571]: DEBUG oslo_vmware.api [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Waiting for the task: (returnval){ [ 1432.282108] env[68571]: value = "task-3467709" [ 1432.282108] env[68571]: _type = "Task" [ 1432.282108] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1432.289685] env[68571]: DEBUG oslo_vmware.api [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Task: {'id': task-3467709, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1432.791928] env[68571]: DEBUG oslo_vmware.exceptions [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Fault InvalidArgument not matched. 
{{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1432.792251] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1432.792794] env[68571]: ERROR nova.compute.manager [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1432.792794] env[68571]: Faults: ['InvalidArgument'] [ 1432.792794] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Traceback (most recent call last): [ 1432.792794] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1432.792794] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] yield resources [ 1432.792794] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1432.792794] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] self.driver.spawn(context, instance, image_meta, [ 1432.792794] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1432.792794] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1432.792794] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1432.792794] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] self._fetch_image_if_missing(context, vi) [ 1432.792794] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1432.793192] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] image_cache(vi, tmp_image_ds_loc) [ 1432.793192] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1432.793192] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] vm_util.copy_virtual_disk( [ 1432.793192] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1432.793192] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] session._wait_for_task(vmdk_copy_task) [ 1432.793192] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] 
File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1432.793192] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] return self.wait_for_task(task_ref) [ 1432.793192] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1432.793192] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] return evt.wait() [ 1432.793192] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1432.793192] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] result = hub.switch() [ 1432.793192] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1432.793192] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] return self.greenlet.switch() [ 1432.793586] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1432.793586] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] self.f(*self.args, **self.kw) [ 1432.793586] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1432.793586] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] raise exceptions.translate_fault(task_info.error) [ 1432.793586] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1432.793586] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Faults: ['InvalidArgument'] [ 1432.793586] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] [ 1432.793586] env[68571]: INFO nova.compute.manager [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Terminating instance [ 1432.794756] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1432.794967] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1432.795239] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-24097f79-d79b-49f6-b1c7-1523ae4f7fcd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1432.798467] env[68571]: DEBUG nova.compute.manager [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1432.798660] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1432.799395] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-743a25d9-a7cd-4e0a-9350-858200464030 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1432.806609] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1432.807586] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-c5df6b33-5c5b-43f9-9111-ff99a978cd2f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1432.809011] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1432.809199] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1432.809852] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-52f426dd-245d-44fc-8b57-5a4846fa485f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1432.815234] env[68571]: DEBUG oslo_vmware.api [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Waiting for the task: (returnval){ [ 1432.815234] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52a74b20-614b-e128-d969-244953c5932f" [ 1432.815234] env[68571]: _type = "Task" [ 1432.815234] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1432.822529] env[68571]: DEBUG oslo_vmware.api [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52a74b20-614b-e128-d969-244953c5932f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1432.885082] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1432.885306] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1432.885490] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Deleting the datastore file [datastore1] 5e571ae2-9d45-402d-bce5-6e3721cc5374 {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1432.885756] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2838d00b-e2a6-4b2a-bd7b-192c8dfd2c38 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1432.891680] env[68571]: DEBUG oslo_vmware.api [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Waiting for the task: (returnval){ [ 1432.891680] env[68571]: value = "task-3467711" [ 1432.891680] env[68571]: _type = "Task" [ 1432.891680] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1432.898809] env[68571]: DEBUG oslo_vmware.api [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Task: {'id': task-3467711, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1433.325686] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1433.325902] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Creating directory with path [datastore1] vmware_temp/63e44eaa-231b-4647-a81a-b19c08b2940e/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1433.326171] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-110e14ac-2440-4c77-a54f-eeda63fbeb55 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.336968] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Created directory with path [datastore1] vmware_temp/63e44eaa-231b-4647-a81a-b19c08b2940e/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1433.337187] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Fetch image to [datastore1] vmware_temp/63e44eaa-231b-4647-a81a-b19c08b2940e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1433.337360] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/63e44eaa-231b-4647-a81a-b19c08b2940e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1433.338045] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed017230-df8c-4278-882e-90bf6d42ac27 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.344012] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edaa98fb-a8bd-4a25-9664-e8b45f759e48 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.352673] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bee57518-57b7-4e12-8649-8ff62322eae7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.383458] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-231d243c-9e80-4976-a126-fdb0a02c10e1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.388903] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-606ef230-40b8-4aa2-b133-c3964189d4f4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.399728] env[68571]: DEBUG oslo_vmware.api [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Task: {'id': task-3467711, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.061372} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1433.399960] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1433.400161] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1433.400338] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1433.400506] env[68571]: INFO nova.compute.manager [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Took 0.60 seconds to destroy the instance on the hypervisor. 
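The spawn failure recorded above ends with oslo.vmware translating the vCenter task error into a VimFaultException: the "Fault InvalidArgument not matched" line is get_fault_class() reporting that no more specific exception class is registered for the 'InvalidArgument' fault, so the generic fault exception is raised carrying the "A specified parameter was not correct: fileType" message. Below is a minimal sketch of constructing and inspecting the same fault object; it assumes only that the oslo.vmware package is importable, and the fault payload is copied from the log rather than produced by a live vCenter.

    # Sketch (assumption: oslo.vmware is installed; no vCenter connection needed).
    # The fault list and message are taken verbatim from the traceback above.
    from oslo_vmware import exceptions

    fault = exceptions.VimFaultException(
        ['InvalidArgument'],
        'A specified parameter was not correct: fileType')

    try:
        raise fault
    except exceptions.VimFaultException as exc:
        # fault_list is what get_fault_class() tries to match against a
        # registered exception class; here nothing matches, so the generic
        # VimFaultException is what callers like wait_for_task() see.
        print(exc.fault_list)  # ['InvalidArgument']
        # str() of the exception appends the fault list, matching the
        # "Faults: ['InvalidArgument']" lines in the traceback above.
        print(exc)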
[ 1433.402644] env[68571]: DEBUG nova.compute.claims [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1433.402816] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1433.403037] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1433.407305] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1433.456641] env[68571]: DEBUG oslo_vmware.rw_handles [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/63e44eaa-231b-4647-a81a-b19c08b2940e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1433.516321] env[68571]: DEBUG oslo_vmware.rw_handles [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1433.516727] env[68571]: DEBUG oslo_vmware.rw_handles [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/63e44eaa-231b-4647-a81a-b19c08b2940e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1433.683355] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0e098569-4135-4744-aff7-e8e68e8823b5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.690433] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba3e4b6d-63f0-4df1-9b55-bba9db70792d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.720931] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1b14549b-f28f-41f3-a3ec-730430c1f587 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.727864] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb0f221f-d69b-4db0-b05d-d1a7fe14cfaf {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1433.740850] env[68571]: DEBUG nova.compute.provider_tree [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1433.749085] env[68571]: DEBUG nova.scheduler.client.report [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1433.763034] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.360s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1433.763458] env[68571]: ERROR nova.compute.manager [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1433.763458] env[68571]: Faults: ['InvalidArgument'] [ 1433.763458] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Traceback (most recent call last): [ 1433.763458] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] File 
"/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1433.763458] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] self.driver.spawn(context, instance, image_meta, [ 1433.763458] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1433.763458] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1433.763458] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1433.763458] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] self._fetch_image_if_missing(context, vi) [ 1433.763458] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1433.763458] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] image_cache(vi, tmp_image_ds_loc) [ 1433.763458] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1433.763837] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] vm_util.copy_virtual_disk( [ 1433.763837] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1433.763837] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] session._wait_for_task(vmdk_copy_task) [ 1433.763837] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1433.763837] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] return self.wait_for_task(task_ref) [ 1433.763837] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1433.763837] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] return evt.wait() [ 1433.763837] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1433.763837] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] result = hub.switch() [ 1433.763837] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1433.763837] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] return self.greenlet.switch() [ 1433.763837] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1433.763837] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] self.f(*self.args, **self.kw) [ 1433.764262] 
env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1433.764262] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] raise exceptions.translate_fault(task_info.error) [ 1433.764262] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1433.764262] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Faults: ['InvalidArgument'] [ 1433.764262] env[68571]: ERROR nova.compute.manager [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] [ 1433.764262] env[68571]: DEBUG nova.compute.utils [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1433.765587] env[68571]: DEBUG nova.compute.manager [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Build of instance 5e571ae2-9d45-402d-bce5-6e3721cc5374 was re-scheduled: A specified parameter was not correct: fileType [ 1433.765587] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1433.765984] env[68571]: DEBUG nova.compute.manager [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1433.766190] env[68571]: DEBUG nova.compute.manager [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1433.766369] env[68571]: DEBUG nova.compute.manager [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1433.766537] env[68571]: DEBUG nova.network.neutron [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1434.178836] env[68571]: DEBUG nova.network.neutron [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1434.188912] env[68571]: INFO nova.compute.manager [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Took 0.42 seconds to deallocate network for instance. [ 1434.299946] env[68571]: INFO nova.scheduler.client.report [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Deleted allocations for instance 5e571ae2-9d45-402d-bce5-6e3721cc5374 [ 1434.331377] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd2dfccc-64f6-4c1e-8a6c-1a508736a0e4 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Lock "5e571ae2-9d45-402d-bce5-6e3721cc5374" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 630.523s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1434.332629] env[68571]: DEBUG oslo_concurrency.lockutils [None req-6a58fc7b-70fb-4642-9878-02de6e811bb1 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Lock "5e571ae2-9d45-402d-bce5-6e3721cc5374" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 433.917s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1434.332885] env[68571]: DEBUG oslo_concurrency.lockutils [None req-6a58fc7b-70fb-4642-9878-02de6e811bb1 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Acquiring lock "5e571ae2-9d45-402d-bce5-6e3721cc5374-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1434.333112] env[68571]: DEBUG oslo_concurrency.lockutils [None req-6a58fc7b-70fb-4642-9878-02de6e811bb1 tempest-ServersNegativeTestMultiTenantJSON-339124010
tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Lock "5e571ae2-9d45-402d-bce5-6e3721cc5374-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1434.333235] env[68571]: DEBUG oslo_concurrency.lockutils [None req-6a58fc7b-70fb-4642-9878-02de6e811bb1 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Lock "5e571ae2-9d45-402d-bce5-6e3721cc5374-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1434.335398] env[68571]: INFO nova.compute.manager [None req-6a58fc7b-70fb-4642-9878-02de6e811bb1 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Terminating instance [ 1434.337501] env[68571]: DEBUG nova.compute.manager [None req-6a58fc7b-70fb-4642-9878-02de6e811bb1 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1434.337693] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-6a58fc7b-70fb-4642-9878-02de6e811bb1 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1434.338500] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-6a560e68-57c8-495d-89bc-b129b744b7d4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1434.346759] env[68571]: DEBUG nova.compute.manager [None req-f7c6b8ad-9819-4120-9a74-961394b05463 tempest-AttachInterfacesUnderV243Test-157106225 tempest-AttachInterfacesUnderV243Test-157106225-project-member] [instance: 36548949-5053-4f4c-a0ca-ac5487a6cf14] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1434.352127] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-94c8546d-21a4-4545-ade3-0e19ad20b6d4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1434.370873] env[68571]: DEBUG nova.compute.manager [None req-f7c6b8ad-9819-4120-9a74-961394b05463 tempest-AttachInterfacesUnderV243Test-157106225 tempest-AttachInterfacesUnderV243Test-157106225-project-member] [instance: 36548949-5053-4f4c-a0ca-ac5487a6cf14] Instance disappeared before build.
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1434.381466] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-6a58fc7b-70fb-4642-9878-02de6e811bb1 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 5e571ae2-9d45-402d-bce5-6e3721cc5374 could not be found. [ 1434.381666] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-6a58fc7b-70fb-4642-9878-02de6e811bb1 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1434.381839] env[68571]: INFO nova.compute.manager [None req-6a58fc7b-70fb-4642-9878-02de6e811bb1 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1434.382090] env[68571]: DEBUG oslo.service.loopingcall [None req-6a58fc7b-70fb-4642-9878-02de6e811bb1 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1434.382316] env[68571]: DEBUG nova.compute.manager [-] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1434.382412] env[68571]: DEBUG nova.network.neutron [-] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1434.398202] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f7c6b8ad-9819-4120-9a74-961394b05463 tempest-AttachInterfacesUnderV243Test-157106225 tempest-AttachInterfacesUnderV243Test-157106225-project-member] Lock "36548949-5053-4f4c-a0ca-ac5487a6cf14" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 218.656s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1434.405953] env[68571]: DEBUG nova.network.neutron [-] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1434.413839] env[68571]: DEBUG nova.compute.manager [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1434.416419] env[68571]: INFO nova.compute.manager [-] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] Took 0.03 seconds to deallocate network for instance.
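NOTE: the sequence above (VimFaultException from the disk-copy task, "Aborting claim", network deallocation, "Build of instance ... was re-scheduled") is the compute manager's standard spawn-failure path. A condensed sketch of that control flow, with simplified names (build_instance, Reschedule) that are not the actual nova method signatures:

    # Condensed sketch of the spawn-failure path traced above; the real
    # logic lives in nova.compute.manager's _build_and_run_instance and
    # _do_build_and_run_instance. Names here are simplified stand-ins.
    class Reschedule(Exception):
        """Signal that the build should be retried on another host."""

    def build_instance(manager, context, instance, image_meta):
        claim = manager.resource_tracker.instance_claim(context, instance)
        try:
            manager.driver.spawn(context, instance, image_meta)
        except Exception as exc:
            claim.abort()                            # "Aborting claim" above
            manager.deallocate_network(context, instance)
            raise Reschedule(str(exc))               # "... was re-scheduled"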
[ 1434.465382] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1434.465643] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1434.467066] env[68571]: INFO nova.compute.claims [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1434.489292] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1434.513810] env[68571]: DEBUG oslo_concurrency.lockutils [None req-6a58fc7b-70fb-4642-9878-02de6e811bb1 tempest-ServersNegativeTestMultiTenantJSON-339124010 tempest-ServersNegativeTestMultiTenantJSON-339124010-project-member] Lock "5e571ae2-9d45-402d-bce5-6e3721cc5374" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.181s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1434.515155] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "5e571ae2-9d45-402d-bce5-6e3721cc5374" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 320.010s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1434.515400] env[68571]: INFO nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 5e571ae2-9d45-402d-bce5-6e3721cc5374] During sync_power_state the instance has a pending task (deleting). Skip.
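NOTE: the "waited 433.917s" / "waited 320.010s" lines show why these operations appear strictly ordered: build, terminate and the periodic power-state sync all serialize on a lock named after the instance UUID. A minimal sketch of the same oslo.concurrency pattern; the skip guard mirrors "During sync_power_state the instance has a pending task (deleting). Skip.", and query_power_state is an assumed driver helper:

    from oslo_concurrency import lockutils

    # Per-instance serialization as seen above: whoever holds the lock
    # named after the instance UUID blocks every other instance operation.
    def sync_power_state(driver, instance):
        with lockutils.lock(instance.uuid):
            if instance.task_state is not None:
                # e.g. task_state == 'deleting': another operation is in
                # flight, so the periodic sync skips this instance.
                return
            driver.query_power_state(instance)  # assumed driver helper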
[ 1434.515527] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "5e571ae2-9d45-402d-bce5-6e3721cc5374" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1434.677827] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3af2f9e4-1330-4223-b939-92be7c47ca9e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1434.685503] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f23f7665-dabd-419a-98b4-d43a962561a7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1434.715994] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1970e31a-a405-4c83-b3ea-3315896c0aeb {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1434.723516] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ffd954b1-febd-411b-b465-21b31095499c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1434.738024] env[68571]: DEBUG nova.compute.provider_tree [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1434.745642] env[68571]: DEBUG nova.scheduler.client.report [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1434.758355] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.293s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1434.758825] env[68571]: DEBUG nova.compute.manager [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Start building networks asynchronously for instance.
{{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1434.790025] env[68571]: DEBUG nova.compute.utils [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1434.790825] env[68571]: DEBUG nova.compute.manager [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1434.791010] env[68571]: DEBUG nova.network.neutron [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1434.802308] env[68571]: DEBUG nova.compute.manager [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1434.865029] env[68571]: DEBUG nova.compute.manager [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1434.884897] env[68571]: DEBUG nova.policy [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '56f65d96057541579b651f191cd888af', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b8702238ce1d4f43a38c62aa706f585f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 1434.889583] env[68571]: DEBUG nova.virt.hardware [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1434.889862] env[68571]: DEBUG nova.virt.hardware [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1434.890072] env[68571]: DEBUG nova.virt.hardware [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1434.890305] env[68571]: DEBUG nova.virt.hardware [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1434.890492] env[68571]: DEBUG nova.virt.hardware [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1434.890677] env[68571]: DEBUG nova.virt.hardware [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1434.890923] 
env[68571]: DEBUG nova.virt.hardware [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1434.891140] env[68571]: DEBUG nova.virt.hardware [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1434.891356] env[68571]: DEBUG nova.virt.hardware [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1434.891559] env[68571]: DEBUG nova.virt.hardware [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1434.891769] env[68571]: DEBUG nova.virt.hardware [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1434.892748] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-57f486cd-d70c-49fe-81a5-8c9a65857a7d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1434.901050] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1612df6c-473d-4645-b02e-f2f4862f5b9e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1435.218441] env[68571]: DEBUG nova.network.neutron [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Successfully created port: 2427c905-64de-4cd8-974d-a1909dec4136 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1435.803618] env[68571]: DEBUG nova.network.neutron [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Successfully updated port: 2427c905-64de-4cd8-974d-a1909dec4136 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1435.817053] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Acquiring lock "refresh_cache-d890a035-a14e-4be0-97c8-87edd9bb88e4" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1435.817333] env[68571]: DEBUG oslo_concurrency.lockutils 
[None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Acquired lock "refresh_cache-d890a035-a14e-4be0-97c8-87edd9bb88e4" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1435.817512] env[68571]: DEBUG nova.network.neutron [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1435.862568] env[68571]: DEBUG nova.network.neutron [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1436.025957] env[68571]: DEBUG nova.network.neutron [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Updating instance_info_cache with network_info: [{"id": "2427c905-64de-4cd8-974d-a1909dec4136", "address": "fa:16:3e:af:22:f6", "network": {"id": "802e91c0-b497-4996-a9a8-0fb2969a1fd5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "129da41d4b1a4202be57f86562f628cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2427c905-64", "ovs_interfaceid": "2427c905-64de-4cd8-974d-a1909dec4136", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1436.038645] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Releasing lock "refresh_cache-d890a035-a14e-4be0-97c8-87edd9bb88e4" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1436.038954] env[68571]: DEBUG nova.compute.manager [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Instance network_info: |[{"id": "2427c905-64de-4cd8-974d-a1909dec4136", "address": "fa:16:3e:af:22:f6", "network": {"id": "802e91c0-b497-4996-a9a8-0fb2969a1fd5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": 
"gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "129da41d4b1a4202be57f86562f628cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2427c905-64", "ovs_interfaceid": "2427c905-64de-4cd8-974d-a1909dec4136", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1436.039367] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:af:22:f6', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': 'f3d7e184-c87f-47a5-8d0d-9fa20e07e669', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '2427c905-64de-4cd8-974d-a1909dec4136', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1436.047009] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Creating folder: Project (b8702238ce1d4f43a38c62aa706f585f). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1436.047545] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-db32346b-e48b-4b62-8df0-0f5e6f25da3d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1436.057288] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Created folder: Project (b8702238ce1d4f43a38c62aa706f585f) in parent group-v692787. [ 1436.057469] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Creating folder: Instances. Parent ref: group-v692872. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1436.057688] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-37fb0d10-c6ac-4f74-a62f-d9d089b7fea9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1436.065674] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Created folder: Instances in parent group-v692872. 
[ 1436.065889] env[68571]: DEBUG oslo.service.loopingcall [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1436.066096] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1436.066288] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-adcfd6c5-0ea6-49b0-82e5-06753a33d0d8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1436.083921] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1436.083921] env[68571]: value = "task-3467714" [ 1436.083921] env[68571]: _type = "Task" [ 1436.083921] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1436.090711] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467714, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1436.212854] env[68571]: DEBUG nova.compute.manager [req-fa07cc83-3e5d-478a-b375-d5aa647a73a1 req-08baf52d-04e9-422b-857a-c6506a71d6a5 service nova] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Received event network-vif-plugged-2427c905-64de-4cd8-974d-a1909dec4136 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1436.212999] env[68571]: DEBUG oslo_concurrency.lockutils [req-fa07cc83-3e5d-478a-b375-d5aa647a73a1 req-08baf52d-04e9-422b-857a-c6506a71d6a5 service nova] Acquiring lock "d890a035-a14e-4be0-97c8-87edd9bb88e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1436.213237] env[68571]: DEBUG oslo_concurrency.lockutils [req-fa07cc83-3e5d-478a-b375-d5aa647a73a1 req-08baf52d-04e9-422b-857a-c6506a71d6a5 service nova] Lock "d890a035-a14e-4be0-97c8-87edd9bb88e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1436.213409] env[68571]: DEBUG oslo_concurrency.lockutils [req-fa07cc83-3e5d-478a-b375-d5aa647a73a1 req-08baf52d-04e9-422b-857a-c6506a71d6a5 service nova] Lock "d890a035-a14e-4be0-97c8-87edd9bb88e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1436.213766] env[68571]: DEBUG nova.compute.manager [req-fa07cc83-3e5d-478a-b375-d5aa647a73a1 req-08baf52d-04e9-422b-857a-c6506a71d6a5 service nova] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] No waiting events found dispatching network-vif-plugged-2427c905-64de-4cd8-974d-a1909dec4136 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1436.213835] env[68571]: WARNING nova.compute.manager [req-fa07cc83-3e5d-478a-b375-d5aa647a73a1 req-08baf52d-04e9-422b-857a-c6506a71d6a5 service nova] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4]
Received unexpected event network-vif-plugged-2427c905-64de-4cd8-974d-a1909dec4136 for instance with vm_state building and task_state spawning. [ 1436.214052] env[68571]: DEBUG nova.compute.manager [req-fa07cc83-3e5d-478a-b375-d5aa647a73a1 req-08baf52d-04e9-422b-857a-c6506a71d6a5 service nova] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Received event network-changed-2427c905-64de-4cd8-974d-a1909dec4136 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1436.214148] env[68571]: DEBUG nova.compute.manager [req-fa07cc83-3e5d-478a-b375-d5aa647a73a1 req-08baf52d-04e9-422b-857a-c6506a71d6a5 service nova] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Refreshing instance network info cache due to event network-changed-2427c905-64de-4cd8-974d-a1909dec4136. {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1436.214454] env[68571]: DEBUG oslo_concurrency.lockutils [req-fa07cc83-3e5d-478a-b375-d5aa647a73a1 req-08baf52d-04e9-422b-857a-c6506a71d6a5 service nova] Acquiring lock "refresh_cache-d890a035-a14e-4be0-97c8-87edd9bb88e4" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1436.214733] env[68571]: DEBUG oslo_concurrency.lockutils [req-fa07cc83-3e5d-478a-b375-d5aa647a73a1 req-08baf52d-04e9-422b-857a-c6506a71d6a5 service nova] Acquired lock "refresh_cache-d890a035-a14e-4be0-97c8-87edd9bb88e4" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1436.214733] env[68571]: DEBUG nova.network.neutron [req-fa07cc83-3e5d-478a-b375-d5aa647a73a1 req-08baf52d-04e9-422b-857a-c6506a71d6a5 service nova] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Refreshing network info cache for port 2427c905-64de-4cd8-974d-a1909dec4136 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1436.593961] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467714, 'name': CreateVM_Task, 'duration_secs': 0.309423} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1436.594338] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1436.594794] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1436.594957] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1436.595296] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1436.595547] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-a1d2b09b-4731-4b65-aa9a-336877e50435 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1436.601103] env[68571]: DEBUG oslo_vmware.api [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Waiting for the task: (returnval){ [ 1436.601103] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52c9a0c2-b2d1-ff42-47ea-3dc21ada8228" [ 1436.601103] env[68571]: _type = "Task" [ 1436.601103] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1436.608787] env[68571]: DEBUG oslo_vmware.api [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52c9a0c2-b2d1-ff42-47ea-3dc21ada8228, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1436.657224] env[68571]: DEBUG nova.network.neutron [req-fa07cc83-3e5d-478a-b375-d5aa647a73a1 req-08baf52d-04e9-422b-857a-c6506a71d6a5 service nova] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Updated VIF entry in instance network info cache for port 2427c905-64de-4cd8-974d-a1909dec4136. 
{{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1436.657584] env[68571]: DEBUG nova.network.neutron [req-fa07cc83-3e5d-478a-b375-d5aa647a73a1 req-08baf52d-04e9-422b-857a-c6506a71d6a5 service nova] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Updating instance_info_cache with network_info: [{"id": "2427c905-64de-4cd8-974d-a1909dec4136", "address": "fa:16:3e:af:22:f6", "network": {"id": "802e91c0-b497-4996-a9a8-0fb2969a1fd5", "bridge": "br-int", "label": "shared", "subnets": [{"cidr": "192.168.233.0/24", "dns": [], "gateway": {"address": "192.168.233.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.233.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.233.2"}}], "meta": {"injected": false, "tenant_id": "129da41d4b1a4202be57f86562f628cb", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "f3d7e184-c87f-47a5-8d0d-9fa20e07e669", "external-id": "nsx-vlan-transportzone-746", "segmentation_id": 746, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap2427c905-64", "ovs_interfaceid": "2427c905-64de-4cd8-974d-a1909dec4136", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1436.666770] env[68571]: DEBUG oslo_concurrency.lockutils [req-fa07cc83-3e5d-478a-b375-d5aa647a73a1 req-08baf52d-04e9-422b-857a-c6506a71d6a5 service nova] Releasing lock "refresh_cache-d890a035-a14e-4be0-97c8-87edd9bb88e4" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1437.111568] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1437.111782] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1437.111992] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1447.531750] env[68571]: DEBUG oslo_concurrency.lockutils [None req-4c67d5ab-62d8-4f67-ba30-d4d5d3ffa7d0 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Acquiring lock "d890a035-a14e-4be0-97c8-87edd9bb88e4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68571) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1481.498797] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1481.510553] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1481.510795] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1481.511010] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1481.511218] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1481.512433] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4743611a-ff31-44ff-9704-19bbc3dee605 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1481.520989] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-354606f5-3016-4686-9111-6aa374a31334 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1481.534595] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9227a303-9f5a-4fae-952a-bd99439b3a36 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1481.540984] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6d1b5ace-f8ec-4fd0-b442-98ebcd1c61b6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1481.568722] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180921MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1481.568863] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 
1481.569065] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1481.639833] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1481.640012] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance b90ac11a-50c6-4d12-a545-ccd92243e6ca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1481.640145] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance afe033a3-4e04-4249-beed-169a3e40a721 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1481.640267] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f5328efa-b3e0-48b2-8f13-9715e46cb017 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1481.640385] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance e025f82d-a6a8-4dd4-b891-872f4b2fa176 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1481.640500] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1481.640616] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 56c7e368-4032-4028-83f0-58b0cd3b3cbd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1481.640732] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 47df3a07-1271-482c-bd3a-92fb9cef17bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1481.640845] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 73ba7761-3724-46ed-95c5-e93a6627a2d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1481.640956] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance d890a035-a14e-4be0-97c8-87edd9bb88e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1481.651484] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 9e8c8d14-144f-42e3-8556-796651b7b04f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1481.661363] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 1f8dd053-ebd8-4ad9-a607-ab364a3320ca has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1481.670334] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 780d6657-20dc-4d8c-acec-0e002f79372b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1481.679860] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 47511138-2486-46a8-85d5-081388bb0b16 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1481.680083] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1481.680232] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1481.829832] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-526580e9-08b8-4c82-ba41-c4c5247b544e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1481.837265] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbbc0cb8-3e91-435d-9749-3c65bdd3ad17 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1481.867410] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5cb4fb11-6432-4868-800d-ce5f417deef5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1481.874192] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-29401aa7-90a8-42ee-920f-357c56914ea1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1481.886758] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1481.894748] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1481.910251] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1481.910428] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.341s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1482.901786] env[68571]: DEBUG oslo_service.periodic_task [None 
req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1482.901786] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1482.901786] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1483.434225] env[68571]: WARNING oslo_vmware.rw_handles [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1483.434225] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1483.434225] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1483.434225] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1483.434225] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1483.434225] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 1483.434225] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1483.434225] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1483.434225] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1483.434225] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1483.434225] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1483.434225] env[68571]: ERROR oslo_vmware.rw_handles [ 1483.434764] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/63e44eaa-231b-4647-a81a-b19c08b2940e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1483.436687] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1483.436953] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Copying Virtual Disk [datastore1] 
vmware_temp/63e44eaa-231b-4647-a81a-b19c08b2940e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/63e44eaa-231b-4647-a81a-b19c08b2940e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1483.437241] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-85692da1-cf9b-49ca-b587-d3e5e0c6c1ea {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1483.444407] env[68571]: DEBUG oslo_vmware.api [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Waiting for the task: (returnval){ [ 1483.444407] env[68571]: value = "task-3467715" [ 1483.444407] env[68571]: _type = "Task" [ 1483.444407] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1483.452216] env[68571]: DEBUG oslo_vmware.api [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Task: {'id': task-3467715, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1483.484825] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1483.955449] env[68571]: DEBUG oslo_vmware.exceptions [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Fault InvalidArgument not matched. 
{{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1483.955449] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1483.955996] env[68571]: ERROR nova.compute.manager [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1483.955996] env[68571]: Faults: ['InvalidArgument'] [ 1483.955996] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Traceback (most recent call last): [ 1483.955996] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1483.955996] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] yield resources [ 1483.955996] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1483.955996] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] self.driver.spawn(context, instance, image_meta, [ 1483.955996] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1483.955996] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1483.955996] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1483.955996] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] self._fetch_image_if_missing(context, vi) [ 1483.955996] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1483.956385] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] image_cache(vi, tmp_image_ds_loc) [ 1483.956385] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1483.956385] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] vm_util.copy_virtual_disk( [ 1483.956385] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1483.956385] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] session._wait_for_task(vmdk_copy_task) [ 1483.956385] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1483.956385] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] return self.wait_for_task(task_ref) [ 1483.956385] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1483.956385] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] return evt.wait() [ 1483.956385] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1483.956385] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] result = hub.switch() [ 1483.956385] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1483.956385] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] return self.greenlet.switch() [ 1483.956777] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1483.956777] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] self.f(*self.args, **self.kw) [ 1483.956777] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1483.956777] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] raise exceptions.translate_fault(task_info.error) [ 1483.956777] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1483.956777] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Faults: ['InvalidArgument'] [ 1483.956777] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] [ 1483.956777] env[68571]: INFO nova.compute.manager [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Terminating instance [ 1483.957917] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1483.958137] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1483.958379] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-59183089-f6ab-4401-87e1-37170b1f9ddd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1483.960476] env[68571]: DEBUG nova.compute.manager [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1483.960673] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1483.961406] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edbaba8d-4871-42ca-9e4d-947188e5bf4a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1483.968288] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1483.969211] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-a471e056-ba94-459e-9f95-a2014ec79b43 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1483.970531] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1483.970705] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1483.971362] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-39661bb0-2cc2-4aa0-b8f5-0fbd222be6da {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1483.976061] env[68571]: DEBUG oslo_vmware.api [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Waiting for the task: (returnval){ [ 1483.976061] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52431dbf-acfe-8971-8d0c-34d7d4ddb36b" [ 1483.976061] env[68571]: _type = "Task" [ 1483.976061] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1483.988638] env[68571]: DEBUG oslo_vmware.api [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52431dbf-acfe-8971-8d0c-34d7d4ddb36b, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1484.043332] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1484.043536] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1484.043709] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Deleting the datastore file [datastore1] 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5 {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1484.044038] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c0f1bd39-ecfc-44d9-a289-b8c2961e5b98 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.050230] env[68571]: DEBUG oslo_vmware.api [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Waiting for the task: (returnval){ [ 1484.050230] env[68571]: value = "task-3467717" [ 1484.050230] env[68571]: _type = "Task" [ 1484.050230] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1484.058716] env[68571]: DEBUG oslo_vmware.api [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Task: {'id': task-3467717, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1484.486815] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1484.487039] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Creating directory with path [datastore1] vmware_temp/cbcaef0a-9335-4448-841a-45486ae73eaa/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1484.487274] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-bb3445fe-cc1e-437f-b253-7ec0cabb1bac {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.498373] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Created directory with path [datastore1] vmware_temp/cbcaef0a-9335-4448-841a-45486ae73eaa/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1484.498546] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Fetch image to [datastore1] vmware_temp/cbcaef0a-9335-4448-841a-45486ae73eaa/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1484.498713] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/cbcaef0a-9335-4448-841a-45486ae73eaa/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1484.499449] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56940465-05fc-4f4a-980f-1c17beb2c804 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.505507] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0dbbf940-88c7-49be-90cf-aa2e92ea65cb {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.514298] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64f65369-48a1-4c87-bad5-1934da8798f7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.544521] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-5f76712a-6c2d-402e-b4ea-3c07d9c953b5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.549988] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-2a496191-23b8-4fb5-bb1d-87e5fced72bf {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.558769] env[68571]: DEBUG oslo_vmware.api [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Task: {'id': task-3467717, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.058932} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1484.559010] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1484.559207] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1484.559378] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1484.559554] env[68571]: INFO nova.compute.manager [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Took 0.60 seconds to destroy the instance on the hypervisor. 
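[editor's note] The entries above show the characteristic task-polling pattern: Nova submits a vCenter task (here a DeleteDatastoreFile_Task), wait_for_task blocks on it, and _poll_task repeatedly fetches the task state, logging "progress is 0%" until the task reports success (with a duration_secs field) or an error fault. A minimal illustrative sketch of that loop follows — not oslo.vmware's actual implementation; get_task_info is a hypothetical helper standing in for the PropertyCollector read performed on each poll:

import time

POLL_INTERVAL = 0.5  # seconds between polls; oslo.vmware drives this with a looping call

def wait_for_task(task_ref, get_task_info):
    """Poll a vCenter task until it finishes.

    get_task_info is a hypothetical helper that returns a dict like
    {'state': 'running', 'progress': 40} for the given task reference.
    """
    while True:
        info = get_task_info(task_ref)
        if info['state'] == 'success':
            return info  # corresponds to the "completed successfully" log line above
        if info['state'] == 'error':
            # oslo.vmware maps the fault name to a specific exception class here;
            # "Fault InvalidArgument not matched." in the log means no specific
            # class was found, so the generic VimFaultException is raised instead.
            raise RuntimeError(info['error'])
        # 'queued' or 'running': log progress and poll again
        print(f"Task: {task_ref} progress is {info.get('progress', 0)}%.")
        time.sleep(POLL_INTERVAL)

[end editor's note]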
[ 1484.561619] env[68571]: DEBUG nova.compute.claims [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1484.561793] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1484.562016] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1484.572870] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1484.629110] env[68571]: DEBUG oslo_vmware.rw_handles [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cbcaef0a-9335-4448-841a-45486ae73eaa/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1484.687902] env[68571]: DEBUG oslo_vmware.rw_handles [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1484.688092] env[68571]: DEBUG oslo_vmware.rw_handles [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/cbcaef0a-9335-4448-841a-45486ae73eaa/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1484.814630] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37e532bc-2ae4-4121-aa5d-74aa045f9806 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.822537] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0f53b656-50ce-446a-840b-f8639441e1c9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.854600] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4ffbfbf9-af96-4c9d-9f19-4b6bb7afb003 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.861986] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7c54a42-50b8-4811-9a24-644080f5a6f0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1484.875094] env[68571]: DEBUG nova.compute.provider_tree [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1484.883759] env[68571]: DEBUG nova.scheduler.client.report [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1484.901987] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.338s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1484.901987] env[68571]: ERROR nova.compute.manager [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1484.901987] env[68571]: Faults: ['InvalidArgument'] [ 1484.901987] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Traceback (most recent call last): [ 1484.901987] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance 
[ 1484.901987] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] self.driver.spawn(context, instance, image_meta, [ 1484.901987] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1484.901987] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1484.901987] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1484.901987] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] self._fetch_image_if_missing(context, vi) [ 1484.902453] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1484.902453] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] image_cache(vi, tmp_image_ds_loc) [ 1484.902453] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1484.902453] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] vm_util.copy_virtual_disk( [ 1484.902453] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1484.902453] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] session._wait_for_task(vmdk_copy_task) [ 1484.902453] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1484.902453] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] return self.wait_for_task(task_ref) [ 1484.902453] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1484.902453] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] return evt.wait() [ 1484.902453] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1484.902453] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] result = hub.switch() [ 1484.902453] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1484.902821] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] return self.greenlet.switch() [ 1484.902821] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1484.902821] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] self.f(*self.args, **self.kw) [ 1484.902821] env[68571]: ERROR nova.compute.manager [instance: 
87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1484.902821] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] raise exceptions.translate_fault(task_info.error) [ 1484.902821] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1484.902821] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Faults: ['InvalidArgument'] [ 1484.902821] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] [ 1484.902821] env[68571]: DEBUG nova.compute.utils [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1484.904083] env[68571]: DEBUG nova.compute.manager [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Build of instance 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5 was re-scheduled: A specified parameter was not correct: fileType [ 1484.904083] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1484.904202] env[68571]: DEBUG nova.compute.manager [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1484.904899] env[68571]: DEBUG nova.compute.manager [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1484.904899] env[68571]: DEBUG nova.compute.manager [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1484.904899] env[68571]: DEBUG nova.network.neutron [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1485.027018] env[68571]: DEBUG neutronclient.v2_0.client [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68571) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1485.028106] env[68571]: ERROR nova.compute.manager [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. [ 1485.028106] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Traceback (most recent call last): [ 1485.028106] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1485.028106] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] self.driver.spawn(context, instance, image_meta, [ 1485.028106] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1485.028106] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1485.028106] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1485.028106] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] self._fetch_image_if_missing(context, vi) [ 1485.028106] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1485.028106] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] image_cache(vi, tmp_image_ds_loc) [ 1485.028106] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1485.028106] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] vm_util.copy_virtual_disk( [ 1485.028486] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File 
"/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1485.028486] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] session._wait_for_task(vmdk_copy_task) [ 1485.028486] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1485.028486] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] return self.wait_for_task(task_ref) [ 1485.028486] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1485.028486] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] return evt.wait() [ 1485.028486] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1485.028486] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] result = hub.switch() [ 1485.028486] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1485.028486] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] return self.greenlet.switch() [ 1485.028486] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1485.028486] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] self.f(*self.args, **self.kw) [ 1485.028486] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1485.028850] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] raise exceptions.translate_fault(task_info.error) [ 1485.028850] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1485.028850] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Faults: ['InvalidArgument'] [ 1485.028850] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] [ 1485.028850] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] During handling of the above exception, another exception occurred: [ 1485.028850] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] [ 1485.028850] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Traceback (most recent call last): [ 1485.028850] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/compute/manager.py", line 2430, in _do_build_and_run_instance [ 1485.028850] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] self._build_and_run_instance(context, instance, image, [ 1485.028850] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File 
"/opt/stack/nova/nova/compute/manager.py", line 2722, in _build_and_run_instance [ 1485.028850] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] raise exception.RescheduledException( [ 1485.028850] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] nova.exception.RescheduledException: Build of instance 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5 was re-scheduled: A specified parameter was not correct: fileType [ 1485.028850] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Faults: ['InvalidArgument'] [ 1485.028850] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] [ 1485.029259] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] During handling of the above exception, another exception occurred: [ 1485.029259] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] [ 1485.029259] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Traceback (most recent call last): [ 1485.029259] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1485.029259] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] ret = obj(*args, **kwargs) [ 1485.029259] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1485.029259] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] exception_handler_v20(status_code, error_body) [ 1485.029259] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1485.029259] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] raise client_exc(message=error_message, [ 1485.029259] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1485.029259] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Neutron server returns request_ids: ['req-5a26adc7-d833-4533-a3db-628a793a4cfa'] [ 1485.029259] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] [ 1485.029259] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] During handling of the above exception, another exception occurred: [ 1485.029636] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] [ 1485.029636] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Traceback (most recent call last): [ 1485.029636] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/compute/manager.py", line 3019, in _cleanup_allocated_networks [ 1485.029636] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] self._deallocate_network(context, instance, requested_networks) [ 1485.029636] env[68571]: ERROR nova.compute.manager 
[instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1485.029636] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] self.network_api.deallocate_for_instance( [ 1485.029636] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1485.029636] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] data = neutron.list_ports(**search_opts) [ 1485.029636] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1485.029636] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] ret = obj(*args, **kwargs) [ 1485.029636] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1485.029636] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] return self.list('ports', self.ports_path, retrieve_all, [ 1485.029636] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1485.030014] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] ret = obj(*args, **kwargs) [ 1485.030014] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1485.030014] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] for r in self._pagination(collection, path, **params): [ 1485.030014] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1485.030014] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] res = self.get(path, params=params) [ 1485.030014] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1485.030014] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] ret = obj(*args, **kwargs) [ 1485.030014] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1485.030014] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] return self.retry_request("GET", action, body=body, [ 1485.030014] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1485.030014] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] ret = obj(*args, **kwargs) [ 1485.030014] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1485.030014] env[68571]: ERROR nova.compute.manager [instance: 
87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] return self.do_request(method, action, body=body, [ 1485.030377] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1485.030377] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] ret = obj(*args, **kwargs) [ 1485.030377] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1485.030377] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] self._handle_fault_response(status_code, replybody, resp) [ 1485.030377] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1485.030377] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] raise exception.Unauthorized() [ 1485.030377] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] nova.exception.Unauthorized: Not authorized. [ 1485.030377] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] [ 1485.081925] env[68571]: INFO nova.scheduler.client.report [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Deleted allocations for instance 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5 [ 1485.102318] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b5a55098-7bf3-4b1f-ac12-5735e815a419 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Lock "87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 634.295s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1485.103358] env[68571]: DEBUG oslo_concurrency.lockutils [None req-64c860a2-0207-49e2-ac05-932c0fa66b26 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Lock "87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 438.725s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1485.103586] env[68571]: DEBUG oslo_concurrency.lockutils [None req-64c860a2-0207-49e2-ac05-932c0fa66b26 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Acquiring lock "87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1485.103789] env[68571]: DEBUG oslo_concurrency.lockutils [None req-64c860a2-0207-49e2-ac05-932c0fa66b26 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Lock "87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1485.103955] env[68571]: DEBUG
oslo_concurrency.lockutils [None req-64c860a2-0207-49e2-ac05-932c0fa66b26 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Lock "87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1485.105841] env[68571]: INFO nova.compute.manager [None req-64c860a2-0207-49e2-ac05-932c0fa66b26 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Terminating instance [ 1485.107633] env[68571]: DEBUG nova.compute.manager [None req-64c860a2-0207-49e2-ac05-932c0fa66b26 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1485.107879] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-64c860a2-0207-49e2-ac05-932c0fa66b26 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1485.108283] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c0b83ed0-e55a-441e-b909-483ac8832cb5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1485.112016] env[68571]: DEBUG nova.compute.manager [None req-16366ddb-2298-4c09-8782-8d1a9c4ad86a tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: 4f0cfa21-d717-494c-8201-2c85dd11e512] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1485.118025] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f45719e-e601-4203-a25b-29afe6a8a988 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1485.134279] env[68571]: DEBUG nova.compute.manager [None req-16366ddb-2298-4c09-8782-8d1a9c4ad86a tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: 4f0cfa21-d717-494c-8201-2c85dd11e512] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}} [ 1485.145731] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-64c860a2-0207-49e2-ac05-932c0fa66b26 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5 could not be found.
[ 1485.145924] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-64c860a2-0207-49e2-ac05-932c0fa66b26 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1485.146113] env[68571]: INFO nova.compute.manager [None req-64c860a2-0207-49e2-ac05-932c0fa66b26 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1485.146351] env[68571]: DEBUG oslo.service.loopingcall [None req-64c860a2-0207-49e2-ac05-932c0fa66b26 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1485.147027] env[68571]: DEBUG nova.compute.manager [-] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1485.147137] env[68571]: DEBUG nova.network.neutron [-] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1485.160432] env[68571]: DEBUG oslo_concurrency.lockutils [None req-16366ddb-2298-4c09-8782-8d1a9c4ad86a tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Lock "4f0cfa21-d717-494c-8201-2c85dd11e512" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 233.839s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1485.170169] env[68571]: DEBUG nova.compute.manager [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Starting instance...
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1485.217062] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1485.217319] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1485.219372] env[68571]: INFO nova.compute.claims [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1485.244385] env[68571]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68571) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1485.244823] env[68571]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1485.245687] env[68571]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception.
[ 1485.245687] env[68571]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1485.245687] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1485.245687] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1485.245687] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1485.245687] env[68571]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1485.245687] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1485.245687] env[68571]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1485.245687] env[68571]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1485.245687] env[68571]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-a0099106-5320-4801-a41a-5eb346ae3f91'] [ 1485.245687] env[68571]: ERROR oslo.service.loopingcall [ 1485.245687] env[68571]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1485.245687] env[68571]: ERROR oslo.service.loopingcall [ 1485.245687] env[68571]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1485.245687] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1485.245687] env[68571]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1485.246134] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1485.246134] env[68571]: ERROR oslo.service.loopingcall result = f(*args, **kwargs) [ 1485.246134] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1485.246134] env[68571]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1485.246134] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1485.246134] env[68571]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1485.246134] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1485.246134] env[68571]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1485.246134] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1485.246134] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1485.246134] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1485.246134] env[68571]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1485.246134] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1485.246134] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1485.246134] env[68571]: ERROR 
oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1485.246134] env[68571]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1485.246134] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1485.246134] env[68571]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1485.246602] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1485.246602] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1485.246602] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1485.246602] env[68571]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1485.246602] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1485.246602] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1485.246602] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1485.246602] env[68571]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1485.246602] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1485.246602] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1485.246602] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1485.246602] env[68571]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1485.246602] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1485.246602] env[68571]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1485.246602] env[68571]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1485.246602] env[68571]: ERROR oslo.service.loopingcall [ 1485.247021] env[68571]: ERROR nova.compute.manager [None req-64c860a2-0207-49e2-ac05-932c0fa66b26 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1485.278214] env[68571]: ERROR nova.compute.manager [None req-64c860a2-0207-49e2-ac05-932c0fa66b26 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1485.278214] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Traceback (most recent call last): [ 1485.278214] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1485.278214] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] ret = obj(*args, **kwargs) [ 1485.278214] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1485.278214] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] exception_handler_v20(status_code, error_body) [ 1485.278214] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1485.278214] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] raise client_exc(message=error_message, [ 1485.278214] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1485.278214] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Neutron server returns request_ids: ['req-a0099106-5320-4801-a41a-5eb346ae3f91'] [ 1485.278581] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] [ 1485.278581] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] During handling of the above exception, another exception occurred: [ 1485.278581] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] [ 1485.278581] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Traceback (most recent call last): [ 1485.278581] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1485.278581] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] self._delete_instance(context, instance, bdms) [ 1485.278581] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1485.278581] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] self._shutdown_instance(context, instance, bdms) [ 1485.278581] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1485.278581] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] self._try_deallocate_network(context, instance, requested_networks) [ 1485.278581] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1485.278581] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] with excutils.save_and_reraise_exception(): [ 1485.278581] env[68571]: ERROR 
nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1485.278581] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] self.force_reraise() [ 1485.279032] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1485.279032] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] raise self.value [ 1485.279032] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1485.279032] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] _deallocate_network_with_retries() [ 1485.279032] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1485.279032] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] return evt.wait() [ 1485.279032] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1485.279032] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] result = hub.switch() [ 1485.279032] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1485.279032] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] return self.greenlet.switch() [ 1485.279032] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1485.279032] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] result = func(*self.args, **self.kw) [ 1485.279382] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1485.279382] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] result = f(*args, **kwargs) [ 1485.279382] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1485.279382] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] self._deallocate_network( [ 1485.279382] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1485.279382] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] self.network_api.deallocate_for_instance( [ 1485.279382] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1485.279382] env[68571]: ERROR nova.compute.manager [instance: 
87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] data = neutron.list_ports(**search_opts) [ 1485.279382] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1485.279382] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] ret = obj(*args, **kwargs) [ 1485.279382] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1485.279382] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] return self.list('ports', self.ports_path, retrieve_all, [ 1485.279382] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1485.279845] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] ret = obj(*args, **kwargs) [ 1485.279845] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1485.279845] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] for r in self._pagination(collection, path, **params): [ 1485.279845] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1485.279845] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] res = self.get(path, params=params) [ 1485.279845] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1485.279845] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] ret = obj(*args, **kwargs) [ 1485.279845] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1485.279845] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] return self.retry_request("GET", action, body=body, [ 1485.279845] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1485.279845] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] ret = obj(*args, **kwargs) [ 1485.279845] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1485.279845] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] return self.do_request(method, action, body=body, [ 1485.280191] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1485.280191] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] ret = obj(*args, **kwargs) [ 1485.280191] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1485.280191] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] self._handle_fault_response(status_code, replybody, resp) [ 1485.280191] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1485.280191] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1485.280191] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1485.280191] env[68571]: ERROR nova.compute.manager [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] [ 1485.306977] env[68571]: DEBUG oslo_concurrency.lockutils [None req-64c860a2-0207-49e2-ac05-932c0fa66b26 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Lock "87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.204s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1485.308169] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 370.803s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1485.308359] env[68571]: INFO nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] During sync_power_state the instance has a pending task (deleting). Skip. [ 1485.308531] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1485.365522] env[68571]: INFO nova.compute.manager [None req-64c860a2-0207-49e2-ac05-932c0fa66b26 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] [instance: 87187d73-5cbe-4d95-b3d6-ecbcb4fe4fa5] Successfully reverted task state from None on failure for instance. [ 1485.369386] env[68571]: ERROR oslo_messaging.rpc.server [None req-64c860a2-0207-49e2-ac05-932c0fa66b26 tempest-DeleteServersAdminTestJSON-1047699960 tempest-DeleteServersAdminTestJSON-1047699960-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1485.369386] env[68571]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1485.369386] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1485.369386] env[68571]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1485.369386] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1485.369386] env[68571]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1485.369386] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1485.369386] env[68571]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1485.369386] env[68571]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1485.369386] env[68571]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-a0099106-5320-4801-a41a-5eb346ae3f91'] [ 1485.369386] env[68571]: ERROR oslo_messaging.rpc.server [ 1485.369386] env[68571]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1485.369386] env[68571]: ERROR oslo_messaging.rpc.server [ 1485.369386] env[68571]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1485.369386] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1485.369874] env[68571]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1485.369874] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1485.369874] env[68571]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1485.369874] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1485.369874] env[68571]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1485.369874] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1485.369874] env[68571]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1485.369874] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1485.369874] env[68571]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1485.369874] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1485.369874] env[68571]: ERROR oslo_messaging.rpc.server raise self.value [ 1485.369874] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1485.369874] env[68571]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1485.369874] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1485.369874] env[68571]: ERROR oslo_messaging.rpc.server with 
excutils.save_and_reraise_exception(): [ 1485.369874] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1485.369874] env[68571]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1485.369874] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1485.370402] env[68571]: ERROR oslo_messaging.rpc.server raise self.value [ 1485.370402] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1485.370402] env[68571]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1485.370402] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 1485.370402] env[68571]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1485.370402] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1485.370402] env[68571]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1485.370402] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1485.370402] env[68571]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1485.370402] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1485.370402] env[68571]: ERROR oslo_messaging.rpc.server raise self.value [ 1485.370402] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1485.370402] env[68571]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1485.370402] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 1485.370402] env[68571]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1485.370402] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1485.370402] env[68571]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1485.370402] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 1485.370835] env[68571]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1485.370835] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1485.370835] env[68571]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1485.370835] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1485.370835] env[68571]: ERROR oslo_messaging.rpc.server raise self.value [ 1485.370835] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1485.370835] env[68571]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1485.370835] env[68571]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1485.370835] env[68571]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1485.370835] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1485.370835] env[68571]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1485.370835] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1485.370835] env[68571]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1485.370835] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1485.370835] env[68571]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1485.370835] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1485.370835] env[68571]: ERROR oslo_messaging.rpc.server raise self.value [ 1485.370835] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1485.371322] env[68571]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1485.371322] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1485.371322] env[68571]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1485.371322] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1485.371322] env[68571]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1485.371322] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1485.371322] env[68571]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1485.371322] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1485.371322] env[68571]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1485.371322] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1485.371322] env[68571]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1485.371322] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1485.371322] env[68571]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1485.371322] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1485.371322] env[68571]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1485.371322] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1485.371322] env[68571]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1485.371322] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1485.371809] env[68571]: ERROR 
oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1485.371809] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1485.371809] env[68571]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1485.371809] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1485.371809] env[68571]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1485.371809] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1485.371809] env[68571]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1485.371809] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1485.371809] env[68571]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1485.371809] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1485.371809] env[68571]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1485.371809] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1485.371809] env[68571]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1485.371809] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1485.371809] env[68571]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1485.371809] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1485.371809] env[68571]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1485.371809] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1485.372382] env[68571]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1485.372382] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1485.372382] env[68571]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1485.372382] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1485.372382] env[68571]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1485.372382] env[68571]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1485.372382] env[68571]: ERROR oslo_messaging.rpc.server [ 1485.421816] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ce2d7867-3ffe-4f00-9b86-8566a2c6c89a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1485.429683] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3960bb38-c5e7-44eb-8d46-32555853efd0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1485.458341] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-97cfc149-bdaa-4acc-809b-4dc9cdea1775 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1485.465143] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d35d508b-4eb7-441a-9eb5-616df9bb5cf2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1485.478729] env[68571]: DEBUG nova.compute.provider_tree [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1485.488648] env[68571]: DEBUG nova.scheduler.client.report [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1485.503323] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.286s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1485.503756] env[68571]: DEBUG nova.compute.manager [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Start building networks asynchronously for instance. 
{{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1485.534824] env[68571]: DEBUG nova.compute.utils [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1485.536323] env[68571]: DEBUG nova.compute.manager [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1485.536534] env[68571]: DEBUG nova.network.neutron [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1485.545029] env[68571]: DEBUG nova.compute.manager [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1485.604219] env[68571]: DEBUG nova.compute.manager [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1485.608057] env[68571]: DEBUG nova.policy [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd34e5361b36c4dc5824b0f42a37e6bb8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '290427ab03f446ce9297ea393c083ff9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 1485.629829] env[68571]: DEBUG nova.virt.hardware [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=<?>,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T21:24:40Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1485.630130] env[68571]: DEBUG nova.virt.hardware [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1485.630349] env[68571]: DEBUG nova.virt.hardware [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1485.630575] env[68571]: DEBUG nova.virt.hardware [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1485.630758] env[68571]: DEBUG nova.virt.hardware [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1485.630936] env[68571]: DEBUG nova.virt.hardware [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1485.631168] env[68571]:
DEBUG nova.virt.hardware [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}}
[ 1485.631332] env[68571]: DEBUG nova.virt.hardware [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}}
[ 1485.631498] env[68571]: DEBUG nova.virt.hardware [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}}
[ 1485.631725] env[68571]: DEBUG nova.virt.hardware [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}}
[ 1485.631837] env[68571]: DEBUG nova.virt.hardware [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}}
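Editor's note: the hardware.py entries above are Nova's CPU-topology negotiation. Limits and preferences of 0:0:0 mean "unconstrained", so the only valid split of one vCPU is 1 socket x 1 core x 1 thread, which is exactly the single topology the log reports. A minimal, self-contained sketch of that enumeration (illustrative only, not Nova's actual implementation):

    # Illustrative sketch of the topology search the log shows: with no
    # flavor/image constraints, enumerate every socket/core/thread split
    # whose product equals the vCPU count.
    import itertools
    from typing import NamedTuple

    class VirtCPUTopology(NamedTuple):
        sockets: int
        cores: int
        threads: int

    def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                            max_threads=65536):
        """Yield every topology whose sockets*cores*threads == vcpus."""
        for s, c, t in itertools.product(
                range(1, min(vcpus, max_sockets) + 1),
                range(1, min(vcpus, max_cores) + 1),
                range(1, min(vcpus, max_threads) + 1)):
            if s * c * t == vcpus:
                yield VirtCPUTopology(s, c, t)

    # For the 1-vCPU m1.nano flavor above this prints the log's sole result:
    # [VirtCPUTopology(sockets=1, cores=1, threads=1)]
    print(list(possible_topologies(1)))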
[ 1485.632683] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6cb4ba8e-3e07-4952-8fb8-df9a52984be3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1485.640391] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-835dc79a-dbfc-4de7-99db-eb0dede1b931 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1485.904432] env[68571]: DEBUG nova.network.neutron [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Successfully created port: 61035f81-c770-4633-9368-8951cc2dbeb8 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}}
[ 1486.436981] env[68571]: DEBUG nova.compute.manager [req-2bb5a17b-f012-4fa8-a9e6-bc38c43494f5 req-93fbd947-e8f8-48bd-a6d7-b15ea565f4bb service nova] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Received event network-vif-plugged-61035f81-c770-4633-9368-8951cc2dbeb8 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1486.437242] env[68571]: DEBUG oslo_concurrency.lockutils [req-2bb5a17b-f012-4fa8-a9e6-bc38c43494f5 req-93fbd947-e8f8-48bd-a6d7-b15ea565f4bb service nova] Acquiring lock "9e8c8d14-144f-42e3-8556-796651b7b04f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1486.437458] env[68571]: DEBUG oslo_concurrency.lockutils [req-2bb5a17b-f012-4fa8-a9e6-bc38c43494f5 req-93fbd947-e8f8-48bd-a6d7-b15ea565f4bb service nova] Lock "9e8c8d14-144f-42e3-8556-796651b7b04f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1486.437647] env[68571]: DEBUG oslo_concurrency.lockutils [req-2bb5a17b-f012-4fa8-a9e6-bc38c43494f5 req-93fbd947-e8f8-48bd-a6d7-b15ea565f4bb service nova] Lock "9e8c8d14-144f-42e3-8556-796651b7b04f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1486.437921] env[68571]: DEBUG nova.compute.manager [req-2bb5a17b-f012-4fa8-a9e6-bc38c43494f5 req-93fbd947-e8f8-48bd-a6d7-b15ea565f4bb service nova] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] No waiting events found dispatching network-vif-plugged-61035f81-c770-4633-9368-8951cc2dbeb8 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}}
[ 1486.438150] env[68571]: WARNING nova.compute.manager [req-2bb5a17b-f012-4fa8-a9e6-bc38c43494f5 req-93fbd947-e8f8-48bd-a6d7-b15ea565f4bb service nova] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Received unexpected event network-vif-plugged-61035f81-c770-4633-9368-8951cc2dbeb8 for instance with vm_state building and task_state spawning.
[ 1486.522751] env[68571]: DEBUG nova.network.neutron [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Successfully updated port: 61035f81-c770-4633-9368-8951cc2dbeb8 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}}
[ 1486.539909] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "refresh_cache-9e8c8d14-144f-42e3-8556-796651b7b04f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1486.539982] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquired lock "refresh_cache-9e8c8d14-144f-42e3-8556-796651b7b04f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1486.540101] env[68571]: DEBUG nova.network.neutron [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}}
[ 1486.576337] env[68571]: DEBUG nova.network.neutron [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Instance cache missing network info. 
{{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1486.732156] env[68571]: DEBUG nova.network.neutron [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Updating instance_info_cache with network_info: [{"id": "61035f81-c770-4633-9368-8951cc2dbeb8", "address": "fa:16:3e:79:a2:3c", "network": {"id": "653e8d49-b7ab-4d09-aa68-b76012e5b38e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-503364041-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "290427ab03f446ce9297ea393c083ff9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap61035f81-c7", "ovs_interfaceid": "61035f81-c770-4633-9368-8951cc2dbeb8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1486.742612] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Releasing lock "refresh_cache-9e8c8d14-144f-42e3-8556-796651b7b04f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1486.742896] env[68571]: DEBUG nova.compute.manager [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Instance network_info: |[{"id": "61035f81-c770-4633-9368-8951cc2dbeb8", "address": "fa:16:3e:79:a2:3c", "network": {"id": "653e8d49-b7ab-4d09-aa68-b76012e5b38e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-503364041-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "290427ab03f446ce9297ea393c083ff9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap61035f81-c7", "ovs_interfaceid": "61035f81-c770-4633-9368-8951cc2dbeb8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}}
[ 1486.743350] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:79:a2:3c', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2180b40f-2bb0-47da-ba80-c2fbe7f98af0', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '61035f81-c770-4633-9368-8951cc2dbeb8', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}}
[ 1486.750887] env[68571]: DEBUG oslo.service.loopingcall [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1486.751332] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}}
[ 1486.751556] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-d5155aa3-d30a-43e7-b72b-33a14e7d5d47 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1486.772215] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){
[ 1486.772215] env[68571]: value = "task-3467718"
[ 1486.772215] env[68571]: _type = "Task"
[ 1486.772215] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1486.780692] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467718, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
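Editor's note: Folder.CreateVM_Task returns a VMware Task managed object, and the "Waiting for the task" / "progress is 0%" pairs above are oslo.vmware polling it until completion. A sketch of the same pattern through oslo.vmware's public session API; the hostname, credentials and the pre-built folder/spec/pool objects are placeholders, not values from this log:

    # Sketch of the CreateVM_Task polling pattern shown above, assuming the
    # caller already built the vm_folder, config_spec and resource_pool
    # managed objects. Not Nova's internal helper, just the library pattern.
    from oslo_vmware import api

    def create_vm(vm_folder, config_spec, resource_pool):
        # The last two arguments (api_retry_count, task_poll_interval) drive
        # the repeated "progress is N%" DEBUG lines seen in the log.
        session = api.VMwareAPISession('vc.example.test', 'user', 'secret',
                                       10, 0.5)
        # invoke_api() issues the SOAP call and returns the task reference.
        task = session.invoke_api(session.vim, 'CreateVM_Task', vm_folder,
                                  config=config_spec, pool=resource_pool)
        # wait_for_task() blocks until 'success' or raises a translated
        # exception if the task ends in a VMware fault.
        return session.wait_for_task(task)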
[ 1487.282775] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467718, 'name': CreateVM_Task, 'duration_secs': 0.309465} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1487.282963] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}}
[ 1487.283649] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1487.283819] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1487.284143] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}}
[ 1487.284391] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4d40ee83-eb56-4b11-bb9a-51c2df2b9f72 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1487.289177] env[68571]: DEBUG oslo_vmware.api [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Waiting for the task: (returnval){
[ 1487.289177] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52672ed3-a048-2e9e-5a75-a5638251b28e"
[ 1487.289177] env[68571]: _type = "Task"
[ 1487.289177] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
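Editor's note: the image-cache entries above show oslo.concurrency serializing access to the [datastore1] devstack-image-cache_base path with a named lock plus an external semaphore; the "waited"/"held" timings in the lock lines come from the same wrapper. A minimal sketch of that locking API (the lock name and lock_path are examples only):

    # Sketch of the oslo.concurrency pattern behind the image-cache lock
    # entries above. external=True adds an interprocess file lock on top of
    # the in-process one, which is what excludes other workers too.
    from oslo_concurrency import lockutils

    @lockutils.synchronized('devstack-image-cache_base', external=True,
                            lock_path='/tmp/example-locks')
    def fetch_image_if_missing():
        # Only one caller at a time runs this critical section; the wrapper
        # logs the acquire/release lines with waited/held durations.
        print('fetching image into the datastore cache')

    # Equivalent context-manager form for ad-hoc critical sections:
    def touch_cache_entry(name):
        with lockutils.lock(name, external=True,
                            lock_path='/tmp/example-locks'):
            print('cache entry %s is locked for the duration' % name)

    fetch_image_if_missing()
    touch_cache_entry('6e7bf233-3ffe-4b3b-a510-62353d0292a6')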
[ 1487.297295] env[68571]: DEBUG oslo_vmware.api [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52672ed3-a048-2e9e-5a75-a5638251b28e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1487.488624] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1487.488903] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}}
[ 1487.488975] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}}
[ 1487.510232] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1487.510396] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1487.510531] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1487.510741] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1487.510902] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1487.511040] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1487.511166] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1487.511285] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Skipping network cache update for instance because it is Building. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1487.511404] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1487.511523] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1487.511645] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 1487.512130] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1487.512318] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1487.512459] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
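Editor's note: _heal_instance_info_cache, _poll_rescued_instances and _reclaim_queued_deletes are all driven by oslo.service's periodic-task machinery, which is what emits the "Running periodic task" lines above. A stripped-down sketch of that wiring; the class, task bodies and spacing values are illustrative, not Nova's real ones:

    # Minimal sketch of oslo.service periodic tasks, assuming illustrative
    # spacing values. The library logs "Running periodic task <name>" each
    # time it dispatches one, as seen above.
    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF

    class Manager(periodic_task.PeriodicTasks):
        def __init__(self):
            super().__init__(CONF)

        @periodic_task.periodic_task(spacing=60)
        def _heal_instance_info_cache(self, context):
            print('refreshing one instance network info cache')

        @periodic_task.periodic_task(spacing=60)
        def _reclaim_queued_deletes(self, context):
            # Mirrors the log: a config guard can turn the task into a no-op.
            print('CONF.reclaim_instance_interval <= 0, skipping...')

    mgr = Manager()
    mgr.run_periodic_tasks(context=None)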
[ 1487.798803] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1487.799066] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}}
[ 1487.799276] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1488.463383] env[68571]: DEBUG nova.compute.manager [req-3e29569b-1f17-450d-a3d2-7eb0b14df173 req-fbf59bee-5629-4b4c-ad76-5a6b2096fac7 service nova] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Received event network-changed-61035f81-c770-4633-9368-8951cc2dbeb8 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}}
[ 1488.463634] env[68571]: DEBUG nova.compute.manager [req-3e29569b-1f17-450d-a3d2-7eb0b14df173 req-fbf59bee-5629-4b4c-ad76-5a6b2096fac7 service nova] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Refreshing instance network info cache due to event network-changed-61035f81-c770-4633-9368-8951cc2dbeb8. {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}}
[ 1488.463888] env[68571]: DEBUG oslo_concurrency.lockutils [req-3e29569b-1f17-450d-a3d2-7eb0b14df173 req-fbf59bee-5629-4b4c-ad76-5a6b2096fac7 service nova] Acquiring lock "refresh_cache-9e8c8d14-144f-42e3-8556-796651b7b04f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}}
[ 1488.464076] env[68571]: DEBUG oslo_concurrency.lockutils [req-3e29569b-1f17-450d-a3d2-7eb0b14df173 req-fbf59bee-5629-4b4c-ad76-5a6b2096fac7 service nova] Acquired lock "refresh_cache-9e8c8d14-144f-42e3-8556-796651b7b04f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1488.465044] env[68571]: DEBUG nova.network.neutron [req-3e29569b-1f17-450d-a3d2-7eb0b14df173 req-fbf59bee-5629-4b4c-ad76-5a6b2096fac7 service nova] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Refreshing network info cache for port 61035f81-c770-4633-9368-8951cc2dbeb8 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}}
[ 1488.697334] env[68571]: DEBUG nova.network.neutron [req-3e29569b-1f17-450d-a3d2-7eb0b14df173 req-fbf59bee-5629-4b4c-ad76-5a6b2096fac7 service nova] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Updated VIF entry in instance network info cache for port 61035f81-c770-4633-9368-8951cc2dbeb8. {{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}}
[ 1488.697804] env[68571]: DEBUG nova.network.neutron [req-3e29569b-1f17-450d-a3d2-7eb0b14df173 req-fbf59bee-5629-4b4c-ad76-5a6b2096fac7 service nova] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Updating instance_info_cache with network_info: [{"id": "61035f81-c770-4633-9368-8951cc2dbeb8", "address": "fa:16:3e:79:a2:3c", "network": {"id": "653e8d49-b7ab-4d09-aa68-b76012e5b38e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-503364041-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "290427ab03f446ce9297ea393c083ff9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap61035f81-c7", "ovs_interfaceid": "61035f81-c770-4633-9368-8951cc2dbeb8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1488.707252] env[68571]: DEBUG oslo_concurrency.lockutils [req-3e29569b-1f17-450d-a3d2-7eb0b14df173 req-fbf59bee-5629-4b4c-ad76-5a6b2096fac7 service nova] Releasing lock "refresh_cache-9e8c8d14-144f-42e3-8556-796651b7b04f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 
1489.489640] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1492.484603] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1505.027607] env[68571]: DEBUG oslo_concurrency.lockutils [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquiring lock "4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1505.027973] env[68571]: DEBUG oslo_concurrency.lockutils [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1533.448483] env[68571]: WARNING oslo_vmware.rw_handles [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1533.448483] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1533.448483] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1533.448483] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1533.448483] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1533.448483] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 1533.448483] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1533.448483] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1533.448483] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1533.448483] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1533.448483] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1533.448483] env[68571]: ERROR oslo_vmware.rw_handles [ 1533.449197] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/cbcaef0a-9335-4448-841a-45486ae73eaa/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image 
/opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1533.450890] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1533.451169] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Copying Virtual Disk [datastore1] vmware_temp/cbcaef0a-9335-4448-841a-45486ae73eaa/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/cbcaef0a-9335-4448-841a-45486ae73eaa/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1533.451488] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ed113d82-a111-4ea2-93b7-8744baae9de9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1533.459842] env[68571]: DEBUG oslo_vmware.api [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Waiting for the task: (returnval){ [ 1533.459842] env[68571]: value = "task-3467719" [ 1533.459842] env[68571]: _type = "Task" [ 1533.459842] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1533.467396] env[68571]: DEBUG oslo_vmware.api [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Task: {'id': task-3467719, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1533.970493] env[68571]: DEBUG oslo_vmware.exceptions [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Fault InvalidArgument not matched. 
{{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1533.970789] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1533.971370] env[68571]: ERROR nova.compute.manager [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1533.971370] env[68571]: Faults: ['InvalidArgument'] [ 1533.971370] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Traceback (most recent call last): [ 1533.971370] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1533.971370] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] yield resources [ 1533.971370] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1533.971370] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] self.driver.spawn(context, instance, image_meta, [ 1533.971370] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1533.971370] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1533.971370] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1533.971370] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] self._fetch_image_if_missing(context, vi) [ 1533.971370] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1533.971736] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] image_cache(vi, tmp_image_ds_loc) [ 1533.971736] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1533.971736] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] vm_util.copy_virtual_disk( [ 1533.971736] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1533.971736] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] session._wait_for_task(vmdk_copy_task) [ 1533.971736] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1533.971736] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] return self.wait_for_task(task_ref) [ 1533.971736] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1533.971736] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] return evt.wait() [ 1533.971736] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1533.971736] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] result = hub.switch() [ 1533.971736] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1533.971736] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] return self.greenlet.switch() [ 1533.972108] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1533.972108] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] self.f(*self.args, **self.kw) [ 1533.972108] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1533.972108] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] raise exceptions.translate_fault(task_info.error) [ 1533.972108] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1533.972108] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Faults: ['InvalidArgument'] [ 1533.972108] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] [ 1533.972108] env[68571]: INFO nova.compute.manager [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Terminating instance [ 1533.973950] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1533.974178] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1533.974884] env[68571]: DEBUG nova.compute.manager [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 
tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1533.975150] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1533.975438] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f5f66b96-b4f0-4b18-bb26-6a3c5c1bcafe {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1533.977993] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50a6e7e4-61dd-4021-abfd-087ef446f844 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1533.985578] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1533.986640] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-0a5763da-4517-43c9-a419-0056901234df {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1533.988119] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1533.988347] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1533.989034] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5cdafabe-0dca-4c07-8bc0-131d0afca9e2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
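Editor's note: the InvalidArgument traceback above ends with oslo.vmware's task poller translating the VMware fault into a VimFaultException, which the compute manager catches, logs, and converts into the terminate/reschedule sequence now underway. A sketch of catching that exception class; the session and copy task are assumed to come from a live oslo.vmware session, as in the earlier sketch:

    # Sketch of handling the fault class the traceback above raises.
    from oslo_vmware import exceptions as vexc

    def copy_disk(session, vmdk_copy_task):
        try:
            return session.wait_for_task(vmdk_copy_task)
        except vexc.VimFaultException as err:
            # fault_list carries the raw VMware fault names, e.g.
            # ['InvalidArgument'], matching the "Faults:" lines in the log.
            print('copy failed: %s (faults: %s)' % (err, err.fault_list))
            raise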
[ 1533.993795] env[68571]: DEBUG oslo_vmware.api [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Waiting for the task: (returnval){
[ 1533.993795] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52ff4a4d-09fe-9ef1-056c-264bbb42ffbf"
[ 1533.993795] env[68571]: _type = "Task"
[ 1533.993795] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1534.003571] env[68571]: DEBUG oslo_vmware.api [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52ff4a4d-09fe-9ef1-056c-264bbb42ffbf, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1534.056453] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1534.056592] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1534.056683] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Deleting the datastore file [datastore1] b90ac11a-50c6-4d12-a545-ccd92243e6ca {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1534.057079] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-a42f1639-20db-4607-98c7-3e4c6df3f860 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1534.062770] env[68571]: DEBUG oslo_vmware.api [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Waiting for the task: (returnval){
[ 1534.062770] env[68571]: value = "task-3467721"
[ 1534.062770] env[68571]: _type = "Task"
[ 1534.062770] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1534.070409] env[68571]: DEBUG oslo_vmware.api [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Task: {'id': task-3467721, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1534.504371] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1534.504680] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Creating directory with path [datastore1] vmware_temp/eada021e-290f-45f3-b65a-ab0880f0c7cf/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1534.504866] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fa21fc58-c435-429a-93b1-2e43dc095c91 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1534.516354] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Created directory with path [datastore1] vmware_temp/eada021e-290f-45f3-b65a-ab0880f0c7cf/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1534.516565] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Fetch image to [datastore1] vmware_temp/eada021e-290f-45f3-b65a-ab0880f0c7cf/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1534.516750] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/eada021e-290f-45f3-b65a-ab0880f0c7cf/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1534.517720] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ab6d06ec-0e81-4fb1-8e9e-6fa9e04e752f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1534.523979] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-278a77d8-96f6-4990-8a7d-90421a90d7c6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1534.532980] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bb27b6d3-eca4-4450-a7cf-07ccbf2335e6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1534.562132] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-b354433c-d211-44eb-93a4-ed668c21aed2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1534.572125] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b78aa428-8200-4925-9efc-2f30b4c9ae3b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1534.573743] env[68571]: DEBUG oslo_vmware.api [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Task: {'id': task-3467721, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073034} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1534.573975] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1534.574172] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1534.574380] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1534.574509] env[68571]: INFO nova.compute.manager [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Took 0.60 seconds to destroy the instance on the hypervisor. 
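Editor's note: the destroy path above is a two-step VMware cleanup: unregister the VM from vCenter inventory, then delete its datastore directory with FileManager.DeleteDatastoreFile_Task. The same sequence sketched with pyVmomi rather than Nova's internal helpers; the host, credentials and VM name are placeholders:

    # Illustrative pyVmomi version of the unregister-then-delete sequence
    # in the entries above. All names here are placeholders.
    import ssl
    from pyVim.connect import SmartConnect, Disconnect
    from pyVim.task import WaitForTask

    si = SmartConnect(host='vc.example.test', user='admin', pwd='secret',
                      sslContext=ssl._create_unverified_context())
    content = si.RetrieveContent()
    dc = content.rootFolder.childEntity[0]  # first datacenter
    vm = content.searchIndex.FindByDnsName(dc, 'doomed-vm', True)

    vm.UnregisterVM()  # drop from inventory, leaving files on disk
    # FileManager then removes the VM's directory, matching the log's
    # DeleteDatastoreFile_Task entry.
    WaitForTask(content.fileManager.DeleteDatastoreFile_Task(
        name='[datastore1] doomed-vm', datacenter=dc))
    Disconnect(si)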
[ 1534.576528] env[68571]: DEBUG nova.compute.claims [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1534.576692] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1534.576900] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1534.593704] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1534.642710] env[68571]: DEBUG oslo_vmware.rw_handles [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/eada021e-290f-45f3-b65a-ab0880f0c7cf/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1534.701684] env[68571]: DEBUG oslo_vmware.rw_handles [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1534.701867] env[68571]: DEBUG oslo_vmware.rw_handles [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/eada021e-290f-45f3-b65a-ab0880f0c7cf/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1534.822218] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-acece188-a3c1-4f8d-bc96-78474e0315fb {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1534.829620] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfd29d64-d524-45ab-896e-0d4711d82fcd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1534.858920] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f5edd01-3136-4954-9b13-8031ad309827 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1534.865228] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1264886-aaf8-48ef-83d8-f34a63f6c49b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1534.877636] env[68571]: DEBUG nova.compute.provider_tree [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1534.885724] env[68571]: DEBUG nova.scheduler.client.report [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1534.898621] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.322s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1534.899122] env[68571]: ERROR nova.compute.manager [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1534.899122] env[68571]: Faults: ['InvalidArgument'] [ 1534.899122] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Traceback (most recent call last): [ 1534.899122] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1534.899122] 
env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] self.driver.spawn(context, instance, image_meta, [ 1534.899122] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1534.899122] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1534.899122] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1534.899122] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] self._fetch_image_if_missing(context, vi) [ 1534.899122] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1534.899122] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] image_cache(vi, tmp_image_ds_loc) [ 1534.899122] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1534.899547] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] vm_util.copy_virtual_disk( [ 1534.899547] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1534.899547] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] session._wait_for_task(vmdk_copy_task) [ 1534.899547] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1534.899547] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] return self.wait_for_task(task_ref) [ 1534.899547] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1534.899547] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] return evt.wait() [ 1534.899547] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1534.899547] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] result = hub.switch() [ 1534.899547] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1534.899547] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] return self.greenlet.switch() [ 1534.899547] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1534.899547] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] self.f(*self.args, **self.kw) [ 1534.899912] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1534.899912] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] raise exceptions.translate_fault(task_info.error) [ 1534.899912] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1534.899912] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Faults: ['InvalidArgument'] [ 1534.899912] env[68571]: ERROR nova.compute.manager [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] [ 1534.899912] env[68571]: DEBUG nova.compute.utils [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1534.901092] env[68571]: DEBUG nova.compute.manager [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Build of instance b90ac11a-50c6-4d12-a545-ccd92243e6ca was re-scheduled: A specified parameter was not correct: fileType [ 1534.901092] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1534.901458] env[68571]: DEBUG nova.compute.manager [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1534.901629] env[68571]: DEBUG nova.compute.manager [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1534.901796] env[68571]: DEBUG nova.compute.manager [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1534.901953] env[68571]: DEBUG nova.network.neutron [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1535.212008] env[68571]: DEBUG nova.network.neutron [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1535.224244] env[68571]: INFO nova.compute.manager [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Took 0.32 seconds to deallocate network for instance. [ 1535.318538] env[68571]: INFO nova.scheduler.client.report [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Deleted allocations for instance b90ac11a-50c6-4d12-a545-ccd92243e6ca [ 1535.344371] env[68571]: DEBUG oslo_concurrency.lockutils [None req-8e067d27-aba1-4ac2-b0e4-dcba3b753684 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Lock "b90ac11a-50c6-4d12-a545-ccd92243e6ca" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 591.473s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1535.345657] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "b90ac11a-50c6-4d12-a545-ccd92243e6ca" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 420.840s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1535.345759] env[68571]: INFO nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] During sync_power_state the instance has a pending task (spawning). Skip. 
[ 1535.345894] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "b90ac11a-50c6-4d12-a545-ccd92243e6ca" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1535.346556] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a82277c5-4713-4316-af14-7eb2399b45d5 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Lock "b90ac11a-50c6-4d12-a545-ccd92243e6ca" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 395.770s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1535.346773] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a82277c5-4713-4316-af14-7eb2399b45d5 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Acquiring lock "b90ac11a-50c6-4d12-a545-ccd92243e6ca-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1535.346977] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a82277c5-4713-4316-af14-7eb2399b45d5 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Lock "b90ac11a-50c6-4d12-a545-ccd92243e6ca-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1535.347236] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a82277c5-4713-4316-af14-7eb2399b45d5 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Lock "b90ac11a-50c6-4d12-a545-ccd92243e6ca-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1535.349697] env[68571]: INFO nova.compute.manager [None req-a82277c5-4713-4316-af14-7eb2399b45d5 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Terminating instance [ 1535.351388] env[68571]: DEBUG nova.compute.manager [None req-a82277c5-4713-4316-af14-7eb2399b45d5 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Start destroying the instance on the hypervisor. 
{{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1535.351550] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a82277c5-4713-4316-af14-7eb2399b45d5 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1535.351849] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-2ef94e8e-84f3-4c6b-83af-871f0c7033d1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1535.356805] env[68571]: DEBUG nova.compute.manager [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1535.363221] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4a899c1b-e23e-4962-bf69-5ba9c53a26e9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1535.392843] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-a82277c5-4713-4316-af14-7eb2399b45d5 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance b90ac11a-50c6-4d12-a545-ccd92243e6ca could not be found. [ 1535.393077] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a82277c5-4713-4316-af14-7eb2399b45d5 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1535.393258] env[68571]: INFO nova.compute.manager [None req-a82277c5-4713-4316-af14-7eb2399b45d5 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1535.393682] env[68571]: DEBUG oslo.service.loopingcall [None req-a82277c5-4713-4316-af14-7eb2399b45d5 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1535.395757] env[68571]: DEBUG nova.compute.manager [-] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1535.395867] env[68571]: DEBUG nova.network.neutron [-] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1535.412308] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1535.412552] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1535.414066] env[68571]: INFO nova.compute.claims [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1535.424267] env[68571]: DEBUG nova.network.neutron [-] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1535.434537] env[68571]: INFO nova.compute.manager [-] [instance: b90ac11a-50c6-4d12-a545-ccd92243e6ca] Took 0.04 seconds to deallocate network for instance. 
[ 1535.555831] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a82277c5-4713-4316-af14-7eb2399b45d5 tempest-ServersNegativeTestJSON-1212413645 tempest-ServersNegativeTestJSON-1212413645-project-member] Lock "b90ac11a-50c6-4d12-a545-ccd92243e6ca" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.209s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1535.642810] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4712546e-cbcc-4243-9d19-db9122fb17a1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1535.650447] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58ae0828-70be-442f-bd62-6decbf147ad4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1535.680085] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a60a027-6ed4-49cb-b12e-186c34f43802 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1535.687078] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dc0f1a72-b574-4795-b349-0fd96e747a67 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1535.699719] env[68571]: DEBUG nova.compute.provider_tree [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1535.707906] env[68571]: DEBUG nova.scheduler.client.report [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1535.721301] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.309s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1535.721766] env[68571]: DEBUG nova.compute.manager [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Start building networks asynchronously for instance. 
{{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1535.754169] env[68571]: DEBUG nova.compute.utils [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1535.755624] env[68571]: DEBUG nova.compute.manager [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1535.755796] env[68571]: DEBUG nova.network.neutron [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1535.763673] env[68571]: DEBUG nova.compute.manager [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1535.822271] env[68571]: DEBUG nova.policy [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3cc047c0063e4c40bda227bb3291e6ef', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3c89544f08b3493a868364ef7726d992', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 1535.825329] env[68571]: DEBUG nova.compute.manager [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1535.849636] env[68571]: DEBUG nova.virt.hardware [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1535.850036] env[68571]: DEBUG nova.virt.hardware [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1535.850230] env[68571]: DEBUG nova.virt.hardware [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1535.850424] env[68571]: DEBUG nova.virt.hardware [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1535.850946] env[68571]: DEBUG nova.virt.hardware [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1535.850946] env[68571]: DEBUG nova.virt.hardware [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1535.851128] env[68571]: DEBUG nova.virt.hardware [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1535.851186] env[68571]: DEBUG nova.virt.hardware [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1535.851353] env[68571]: DEBUG nova.virt.hardware [None 
req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1535.851539] env[68571]: DEBUG nova.virt.hardware [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1535.851709] env[68571]: DEBUG nova.virt.hardware [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1535.852552] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-804eaf2e-ff55-4f3a-8784-56fb7792cf2a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1535.860647] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c69b0205-8548-4331-9fe8-87d96976d1be {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1536.105394] env[68571]: DEBUG nova.network.neutron [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Successfully created port: 57168d0a-d87f-4584-b9e8-be6837c29d51 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1536.661592] env[68571]: DEBUG nova.network.neutron [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Successfully updated port: 57168d0a-d87f-4584-b9e8-be6837c29d51 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1536.672364] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Acquiring lock "refresh_cache-1f8dd053-ebd8-4ad9-a607-ab364a3320ca" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1536.672713] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Acquired lock "refresh_cache-1f8dd053-ebd8-4ad9-a607-ab364a3320ca" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1536.672713] env[68571]: DEBUG nova.network.neutron [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1536.711393] env[68571]: DEBUG nova.network.neutron [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 
tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1536.857357] env[68571]: DEBUG nova.network.neutron [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Updating instance_info_cache with network_info: [{"id": "57168d0a-d87f-4584-b9e8-be6837c29d51", "address": "fa:16:3e:b5:d2:56", "network": {"id": "f5355e81-1889-404d-8808-a8cbc8245e73", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-607972824-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3c89544f08b3493a868364ef7726d992", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2d88bb07-f93c-45ca-bce7-230cb1f33833", "external-id": "nsx-vlan-transportzone-387", "segmentation_id": 387, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap57168d0a-d8", "ovs_interfaceid": "57168d0a-d87f-4584-b9e8-be6837c29d51", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1536.870587] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Releasing lock "refresh_cache-1f8dd053-ebd8-4ad9-a607-ab364a3320ca" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1536.870864] env[68571]: DEBUG nova.compute.manager [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Instance network_info: |[{"id": "57168d0a-d87f-4584-b9e8-be6837c29d51", "address": "fa:16:3e:b5:d2:56", "network": {"id": "f5355e81-1889-404d-8808-a8cbc8245e73", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-607972824-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3c89544f08b3493a868364ef7726d992", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2d88bb07-f93c-45ca-bce7-230cb1f33833", "external-id": "nsx-vlan-transportzone-387", "segmentation_id": 387, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap57168d0a-d8", "ovs_interfaceid": "57168d0a-d87f-4584-b9e8-be6837c29d51", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": 
{}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1536.871276] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:b5:d2:56', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2d88bb07-f93c-45ca-bce7-230cb1f33833', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '57168d0a-d87f-4584-b9e8-be6837c29d51', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1536.878590] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Creating folder: Project (3c89544f08b3493a868364ef7726d992). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1536.879119] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-7547072d-692e-4ba2-8fc3-14e8d7c7d440 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1536.889475] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Created folder: Project (3c89544f08b3493a868364ef7726d992) in parent group-v692787. [ 1536.889778] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Creating folder: Instances. Parent ref: group-v692876. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1536.890084] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-a10f8309-dba7-4e75-874e-253964aa1493 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1536.899021] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Created folder: Instances in parent group-v692876. [ 1536.899021] env[68571]: DEBUG oslo.service.loopingcall [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1536.899021] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1536.899241] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-664e1d6c-d10a-4623-ac02-a65929b9f57d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1536.917700] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1536.917700] env[68571]: value = "task-3467724" [ 1536.917700] env[68571]: _type = "Task" [ 1536.917700] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1536.925118] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467724, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1537.289359] env[68571]: DEBUG nova.compute.manager [req-09ddd24b-1ea0-41e7-8055-3fb7411b3ecb req-1ee954ca-b72c-44d9-9310-101bea0d64f2 service nova] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Received event network-vif-plugged-57168d0a-d87f-4584-b9e8-be6837c29d51 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1537.289944] env[68571]: DEBUG oslo_concurrency.lockutils [req-09ddd24b-1ea0-41e7-8055-3fb7411b3ecb req-1ee954ca-b72c-44d9-9310-101bea0d64f2 service nova] Acquiring lock "1f8dd053-ebd8-4ad9-a607-ab364a3320ca-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1537.290202] env[68571]: DEBUG oslo_concurrency.lockutils [req-09ddd24b-1ea0-41e7-8055-3fb7411b3ecb req-1ee954ca-b72c-44d9-9310-101bea0d64f2 service nova] Lock "1f8dd053-ebd8-4ad9-a607-ab364a3320ca-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1537.290393] env[68571]: DEBUG oslo_concurrency.lockutils [req-09ddd24b-1ea0-41e7-8055-3fb7411b3ecb req-1ee954ca-b72c-44d9-9310-101bea0d64f2 service nova] Lock "1f8dd053-ebd8-4ad9-a607-ab364a3320ca-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1537.290566] env[68571]: DEBUG nova.compute.manager [req-09ddd24b-1ea0-41e7-8055-3fb7411b3ecb req-1ee954ca-b72c-44d9-9310-101bea0d64f2 service nova] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] No waiting events found dispatching network-vif-plugged-57168d0a-d87f-4584-b9e8-be6837c29d51 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1537.290730] env[68571]: WARNING nova.compute.manager [req-09ddd24b-1ea0-41e7-8055-3fb7411b3ecb req-1ee954ca-b72c-44d9-9310-101bea0d64f2 service nova] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Received unexpected event network-vif-plugged-57168d0a-d87f-4584-b9e8-be6837c29d51 for instance with vm_state building and task_state spawning. 
[ 1537.290888] env[68571]: DEBUG nova.compute.manager [req-09ddd24b-1ea0-41e7-8055-3fb7411b3ecb req-1ee954ca-b72c-44d9-9310-101bea0d64f2 service nova] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Received event network-changed-57168d0a-d87f-4584-b9e8-be6837c29d51 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1537.291056] env[68571]: DEBUG nova.compute.manager [req-09ddd24b-1ea0-41e7-8055-3fb7411b3ecb req-1ee954ca-b72c-44d9-9310-101bea0d64f2 service nova] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Refreshing instance network info cache due to event network-changed-57168d0a-d87f-4584-b9e8-be6837c29d51. {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1537.291271] env[68571]: DEBUG oslo_concurrency.lockutils [req-09ddd24b-1ea0-41e7-8055-3fb7411b3ecb req-1ee954ca-b72c-44d9-9310-101bea0d64f2 service nova] Acquiring lock "refresh_cache-1f8dd053-ebd8-4ad9-a607-ab364a3320ca" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1537.291413] env[68571]: DEBUG oslo_concurrency.lockutils [req-09ddd24b-1ea0-41e7-8055-3fb7411b3ecb req-1ee954ca-b72c-44d9-9310-101bea0d64f2 service nova] Acquired lock "refresh_cache-1f8dd053-ebd8-4ad9-a607-ab364a3320ca" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1537.291569] env[68571]: DEBUG nova.network.neutron [req-09ddd24b-1ea0-41e7-8055-3fb7411b3ecb req-1ee954ca-b72c-44d9-9310-101bea0d64f2 service nova] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Refreshing network info cache for port 57168d0a-d87f-4584-b9e8-be6837c29d51 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1537.427205] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467724, 'name': CreateVM_Task, 'duration_secs': 0.28931} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1537.427382] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1537.428097] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1537.428294] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1537.428607] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1537.428864] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-9cd235f7-fc2a-4d44-814d-07b6288dbdc3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1537.435791] env[68571]: DEBUG oslo_vmware.api [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Waiting for the task: (returnval){ [ 1537.435791] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52b4123f-84c3-7efc-426e-e10a4e88b30f" [ 1537.435791] env[68571]: _type = "Task" [ 1537.435791] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1537.445677] env[68571]: DEBUG oslo_vmware.api [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52b4123f-84c3-7efc-426e-e10a4e88b30f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1537.517120] env[68571]: DEBUG nova.network.neutron [req-09ddd24b-1ea0-41e7-8055-3fb7411b3ecb req-1ee954ca-b72c-44d9-9310-101bea0d64f2 service nova] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Updated VIF entry in instance network info cache for port 57168d0a-d87f-4584-b9e8-be6837c29d51. 
{{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1537.517536] env[68571]: DEBUG nova.network.neutron [req-09ddd24b-1ea0-41e7-8055-3fb7411b3ecb req-1ee954ca-b72c-44d9-9310-101bea0d64f2 service nova] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Updating instance_info_cache with network_info: [{"id": "57168d0a-d87f-4584-b9e8-be6837c29d51", "address": "fa:16:3e:b5:d2:56", "network": {"id": "f5355e81-1889-404d-8808-a8cbc8245e73", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-607972824-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "3c89544f08b3493a868364ef7726d992", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2d88bb07-f93c-45ca-bce7-230cb1f33833", "external-id": "nsx-vlan-transportzone-387", "segmentation_id": 387, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap57168d0a-d8", "ovs_interfaceid": "57168d0a-d87f-4584-b9e8-be6837c29d51", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1537.526604] env[68571]: DEBUG oslo_concurrency.lockutils [req-09ddd24b-1ea0-41e7-8055-3fb7411b3ecb req-1ee954ca-b72c-44d9-9310-101bea0d64f2 service nova] Releasing lock "refresh_cache-1f8dd053-ebd8-4ad9-a607-ab364a3320ca" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1537.946796] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1537.947143] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1537.947452] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1542.490269] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1542.490269] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None 
None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1542.502556] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1542.502556] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1542.502556] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1542.502556] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1542.503883] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30711abe-53e8-4dfb-a4fc-ff725b7e8b5f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1542.512344] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-568d1c10-9fe8-4804-98a3-96b7f925dbb3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1542.526543] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2dd9fdb-015c-4ee6-9a43-b9e3406dd510 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1542.532904] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a5bfb37d-2ccf-4495-b24b-c5125d96979b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1542.561684] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180931MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1542.561967] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1542.562072] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by 
"nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1542.638916] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance afe033a3-4e04-4249-beed-169a3e40a721 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1542.639087] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f5328efa-b3e0-48b2-8f13-9715e46cb017 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1542.639221] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance e025f82d-a6a8-4dd4-b891-872f4b2fa176 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1542.639348] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1542.639475] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 56c7e368-4032-4028-83f0-58b0cd3b3cbd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1542.639587] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 47df3a07-1271-482c-bd3a-92fb9cef17bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1542.639703] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 73ba7761-3724-46ed-95c5-e93a6627a2d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1542.639817] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance d890a035-a14e-4be0-97c8-87edd9bb88e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1542.639930] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 9e8c8d14-144f-42e3-8556-796651b7b04f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1542.640051] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 1f8dd053-ebd8-4ad9-a607-ab364a3320ca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1542.653521] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 780d6657-20dc-4d8c-acec-0e002f79372b has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1542.663970] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 47511138-2486-46a8-85d5-081388bb0b16 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1542.673194] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1542.673426] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1542.673593] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1542.824059] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9701a885-b828-415c-9ca0-99862f64630f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1542.831429] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09898789-b3e7-41e8-bae8-894c861f287d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1542.860696] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11b9825e-a8f8-421e-be71-5c6e9768ee87 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1542.867565] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b50fc33f-843c-4515-b30a-c936f75e5b64 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1542.881129] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1542.890073] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1542.904898] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1542.905155] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.343s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1543.906938] env[68571]: DEBUG oslo_service.periodic_task [None 
req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1543.906938] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1544.484591] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1547.489977] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1547.490272] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1547.490322] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1547.521884] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1547.522207] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1547.522410] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1547.522594] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1547.522784] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1547.523018] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Skipping network cache update for instance because it is Building. 
[ 1547.523018] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1547.523297] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1547.523297] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1547.523460] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1547.523626] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}}
[ 1547.524455] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}}
[ 1548.488716] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1548.488956] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1548.489123] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}}
[ 1549.516822] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Acquiring lock "8506e00f-2b77-4fa1-804a-8e548b78ee7d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1549.517166] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Lock "8506e00f-2b77-4fa1-804a-8e548b78ee7d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1550.490373] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}}
[ 1552.267999] env[68571]: DEBUG oslo_concurrency.lockutils [None req-6abd1a94-7c8e-4161-9109-7d166f9dd86a tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "9e8c8d14-144f-42e3-8556-796651b7b04f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1564.704947] env[68571]: DEBUG oslo_concurrency.lockutils [None req-fd85a834-21e2-40a1-abc0-4151f4a7065c tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Acquiring lock "1f8dd053-ebd8-4ad9-a607-ab364a3320ca" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1573.163428] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Acquiring lock "7fd03349-420c-4076-959c-31562e95098d" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1573.164347] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Lock "7fd03349-420c-4076-959c-31562e95098d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
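The paired "Acquiring lock ... / Lock ... acquired ... waited" lines are oslo.concurrency's lock wrapper serializing builds per instance UUID. A sketch of the pattern that produces them (the lock name is copied from the log; the function body is a stand-in):

    from oslo_concurrency import lockutils

    @lockutils.synchronized('8506e00f-2b77-4fa1-804a-8e548b78ee7d')
    def _locked_do_build_and_run_instance():
        # only one build per instance UUID may run at a time; the wrapper
        # logs acquire/wait/hold times like the lines above
        pass

    _locked_do_build_and_run_instance()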
[ 1580.948172] env[68571]: DEBUG oslo_concurrency.lockutils [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Acquiring lock "5deee3f1-70a0-4c0d-bda6-365235ca0d78" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1580.948540] env[68571]: DEBUG oslo_concurrency.lockutils [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Lock "5deee3f1-70a0-4c0d-bda6-365235ca0d78" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1582.315159] env[68571]: WARNING oslo_vmware.rw_handles [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response
[ 1582.315159] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last):
[ 1582.315159] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close
[ 1582.315159] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse()
[ 1582.315159] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
[ 1582.315159] env[68571]: ERROR oslo_vmware.rw_handles response.begin()
[ 1582.315159] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin
[ 1582.315159] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status()
[ 1582.315159] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
[ 1582.315159] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without"
[ 1582.315159] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response
[ 1582.315159] env[68571]: ERROR oslo_vmware.rw_handles
[ 1582.315866] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/eada021e-290f-45f3-b65a-ab0880f0c7cf/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
[ 1582.317455] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}}
[ 1582.317727] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Copying Virtual Disk [datastore1] vmware_temp/eada021e-290f-45f3-b65a-ab0880f0c7cf/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/eada021e-290f-45f3-b65a-ab0880f0c7cf/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}}
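The RemoteDisconnected warning above fires when the upload handle is closed and the far end has already hung up. A hedged sketch of the tolerant-close pattern, assuming a plain http.client connection object rather than oslo.vmware's internal handle:

    import http.client

    def close_quietly(conn):
        """Close an upload connection, ignoring a server-side hangup."""
        try:
            resp = conn.getresponse()  # may raise RemoteDisconnected
            resp.read()
        except http.client.RemoteDisconnected:
            pass  # transfer already finished; nothing left to read
        finally:
            conn.close()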
[ 1582.318029] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-021dc727-9767-4bec-90c2-b3071d2c1142 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1582.327119] env[68571]: DEBUG oslo_vmware.api [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Waiting for the task: (returnval){
[ 1582.327119] env[68571]: value = "task-3467725"
[ 1582.327119] env[68571]: _type = "Task"
[ 1582.327119] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1582.334917] env[68571]: DEBUG oslo_vmware.api [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Task: {'id': task-3467725, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1582.837321] env[68571]: DEBUG oslo_vmware.exceptions [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Fault InvalidArgument not matched. {{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}}
[ 1582.837580] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}}
[ 1582.838153] env[68571]: ERROR nova.compute.manager [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1582.838153] env[68571]: Faults: ['InvalidArgument']
[ 1582.838153] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] Traceback (most recent call last):
[ 1582.838153] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources
[ 1582.838153] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] yield resources
[ 1582.838153] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1582.838153] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] self.driver.spawn(context, instance, image_meta,
[ 1582.838153] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1582.838153] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1582.838153] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1582.838153] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] self._fetch_image_if_missing(context, vi)
[ 1582.838153] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1582.838575] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] image_cache(vi, tmp_image_ds_loc)
[ 1582.838575] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1582.838575] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] vm_util.copy_virtual_disk(
[ 1582.838575] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1582.838575] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] session._wait_for_task(vmdk_copy_task)
[ 1582.838575] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1582.838575] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] return self.wait_for_task(task_ref)
[ 1582.838575] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1582.838575] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] return evt.wait()
[ 1582.838575] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1582.838575] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] result = hub.switch()
[ 1582.838575] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1582.838575] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] return self.greenlet.switch()
[ 1582.838985] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1582.838985] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] self.f(*self.args, **self.kw)
[ 1582.838985] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1582.838985] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] raise exceptions.translate_fault(task_info.error)
[ 1582.838985] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1582.838985] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] Faults: ['InvalidArgument']
[ 1582.838985] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721]
[ 1582.838985] env[68571]: INFO nova.compute.manager [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Terminating instance
[ 1582.840008] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}}
[ 1582.840218] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1582.840815] env[68571]: DEBUG nova.compute.manager [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1582.841017] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1582.841267] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b8e1c563-5c70-4514-865a-2f84bfcf27be {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1582.843624] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-382e3c58-fc5e-4fef-94c2-50ce113891f5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1582.849856] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}}
[ 1582.850048] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-e68bde30-1a0b-4384-9b8d-19e93c0de0e7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1582.852028] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
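When the CopyVirtualDisk task fails, oslo.vmware translates the vCenter fault into a VimFaultException carrying the fault names in fault_list, which is what the traceback above shows. A sketch of reacting to that specific fault; session, copy_task and the callback are placeholders, not Nova's API:

    from oslo_vmware import exceptions as vexc

    def copy_disk(session, copy_task, on_invalid_argument):
        try:
            session.wait_for_task(copy_task)
        except vexc.VimFaultException as e:
            if 'InvalidArgument' in (e.fault_list or []):
                # the bogus fileType parameter seen above: hand control
                # back so the claim is aborted and the build rescheduled
                on_invalid_argument(e)
            else:
                raise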
[ 1582.852205] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}}
[ 1582.853127] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-425aef4b-bd14-4e79-a064-7ef61267cf94 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1582.857754] env[68571]: DEBUG oslo_vmware.api [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Waiting for the task: (returnval){
[ 1582.857754] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]526d49e6-9ac4-4619-bfe4-94076921cf2a"
[ 1582.857754] env[68571]: _type = "Task"
[ 1582.857754] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1582.864572] env[68571]: DEBUG oslo_vmware.api [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]526d49e6-9ac4-4619-bfe4-94076921cf2a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1582.919453] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}}
[ 1582.919453] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}}
[ 1582.919453] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Deleting the datastore file [datastore1] afe033a3-4e04-4249-beed-169a3e40a721 {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}}
[ 1582.919453] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-9cae3f22-da00-452a-aa13-ae557dde1bbb {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1582.925311] env[68571]: DEBUG oslo_vmware.api [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Waiting for the task: (returnval){
[ 1582.925311] env[68571]: value = "task-3467727"
[ 1582.925311] env[68571]: _type = "Task"
[ 1582.925311] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}}
[ 1582.936355] env[68571]: DEBUG oslo_vmware.api [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Task: {'id': task-3467727, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}}
[ 1583.367380] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}}
[ 1583.367677] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Creating directory with path [datastore1] vmware_temp/4346c087-29f7-4033-a9e7-b3027e9f06e9/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}}
[ 1583.367896] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7a08108a-4657-41d0-b5e9-4677e0faa8ca {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1583.379309] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Created directory with path [datastore1] vmware_temp/4346c087-29f7-4033-a9e7-b3027e9f06e9/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}}
[ 1583.379499] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Fetch image to [datastore1] vmware_temp/4346c087-29f7-4033-a9e7-b3027e9f06e9/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}}
[ 1583.379661] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/4346c087-29f7-4033-a9e7-b3027e9f06e9/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}}
[ 1583.380393] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-247de8d1-1eec-432d-a817-1625c3ec4ecc {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1583.386878] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b31a7e8-da37-410e-b1e3-a76e8a358138 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
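Image caching stages the download under a per-request temp directory before copying it next to the cache entry. A toy reconstruction of the path shape visible in the log (pure string assembly, no vSphere calls; the helper name is invented):

    import uuid

    def tmp_fetch_path(datastore, image_id):
        tmp_dir = 'vmware_temp/%s/%s' % (uuid.uuid4(), image_id)
        return '[%s] %s/tmp-sparse.vmdk' % (datastore, tmp_dir)

    print(tmp_fetch_path('datastore1', '6e7bf233-3ffe-4b3b-a510-62353d0292a6'))
    # e.g. [datastore1] vmware_temp/<random uuid>/<image id>/tmp-sparse.vmdk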
[ 1583.395564] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30afe7e3-64e4-4449-b5fb-cb89763979a1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1583.424854] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc3459b3-d7fd-4d95-9c8c-08d5bfc2802d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1583.435748] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-b075c752-eba7-4825-b795-1b1e3834bbc9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1583.437393] env[68571]: DEBUG oslo_vmware.api [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Task: {'id': task-3467727, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.07587} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}}
[ 1583.437659] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}}
[ 1583.437840] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}}
[ 1583.438188] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1583.438262] env[68571]: INFO nova.compute.manager [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Took 0.60 seconds to destroy the instance on the hypervisor.
[ 1583.440531] env[68571]: DEBUG nova.compute.claims [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}}
[ 1583.440721] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1583.440938] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1583.458797] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}}
[ 1583.515370] env[68571]: DEBUG oslo_vmware.rw_handles [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4346c087-29f7-4033-a9e7-b3027e9f06e9/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}}
[ 1583.574499] env[68571]: DEBUG oslo_vmware.rw_handles [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}}
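The write handle targets the host's /folder endpoint with the datastore-relative path in the URL and the datacenter and datastore as query parameters. A sketch of that URL shape (host and names taken from the log line above; the helper itself is illustrative, not oslo.vmware's API):

    from urllib.parse import urlencode

    def datastore_file_url(host, ds_path, dc_path, ds_name):
        query = urlencode({'dcPath': dc_path, 'dsName': ds_name})
        return 'https://%s:443/folder/%s?%s' % (host, ds_path, query)

    print(datastore_file_url(
        'esx7c2n1.openstack.eu-de-1.cloud.sap',
        'vmware_temp/4346c087-29f7-4033-a9e7-b3027e9f06e9/'
        '6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk',
        'ha-datacenter',
        'datastore1'))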
[ 1583.574674] env[68571]: DEBUG oslo_vmware.rw_handles [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/4346c087-29f7-4033-a9e7-b3027e9f06e9/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}}
[ 1583.702431] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b50bc47-b71a-43f6-8e1c-83870a5626af {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1583.709685] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdbefc4c-28e6-450c-8cd6-4d87ddcc7fe9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1583.739530] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3691a2c-383a-4503-9dab-20df1fa6a071 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1583.747195] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-83d16f78-a8f8-4432-91d9-8e7c395c98ab {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1583.760585] env[68571]: DEBUG nova.compute.provider_tree [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1583.769218] env[68571]: DEBUG nova.scheduler.client.report [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1583.782683] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.342s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1583.783197] env[68571]: ERROR nova.compute.manager [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1583.783197] env[68571]: Faults: ['InvalidArgument']
[ 1583.783197] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] Traceback (most recent call last):
[ 1583.783197] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1583.783197] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] self.driver.spawn(context, instance, image_meta,
[ 1583.783197] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1583.783197] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1583.783197] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1583.783197] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] self._fetch_image_if_missing(context, vi)
[ 1583.783197] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1583.783197] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] image_cache(vi, tmp_image_ds_loc)
[ 1583.783197] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1583.783537] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] vm_util.copy_virtual_disk(
[ 1583.783537] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1583.783537] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] session._wait_for_task(vmdk_copy_task)
[ 1583.783537] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1583.783537] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] return self.wait_for_task(task_ref)
[ 1583.783537] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1583.783537] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] return evt.wait()
[ 1583.783537] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1583.783537] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] result = hub.switch()
[ 1583.783537] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1583.783537] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] return self.greenlet.switch()
[ 1583.783537] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1583.783537] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] self.f(*self.args, **self.kw)
[ 1583.783907] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1583.783907] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] raise exceptions.translate_fault(task_info.error)
[ 1583.783907] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1583.783907] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721] Faults: ['InvalidArgument']
[ 1583.783907] env[68571]: ERROR nova.compute.manager [instance: afe033a3-4e04-4249-beed-169a3e40a721]
[ 1583.783907] env[68571]: DEBUG nova.compute.utils [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1583.785203] env[68571]: DEBUG nova.compute.manager [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Build of instance afe033a3-4e04-4249-beed-169a3e40a721 was re-scheduled: A specified parameter was not correct: fileType
[ 1583.785203] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1583.785568] env[68571]: DEBUG nova.compute.manager [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1583.785743] env[68571]: DEBUG nova.compute.manager [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1583.785913] env[68571]: DEBUG nova.compute.manager [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1583.786094] env[68571]: DEBUG nova.network.neutron [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1584.099027] env[68571]: DEBUG nova.network.neutron [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1584.111542] env[68571]: INFO nova.compute.manager [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Took 0.33 seconds to deallocate network for instance.
[ 1584.198640] env[68571]: INFO nova.scheduler.client.report [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Deleted allocations for instance afe033a3-4e04-4249-beed-169a3e40a721
[ 1584.220772] env[68571]: DEBUG oslo_concurrency.lockutils [None req-c4f05374-190d-4341-a260-98a7fbec8cc0 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "afe033a3-4e04-4249-beed-169a3e40a721" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 620.748s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1584.221780] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b556daf8-df4d-475b-a3e9-a01ce9a72292 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "afe033a3-4e04-4249-beed-169a3e40a721" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 424.362s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1584.222078] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b556daf8-df4d-475b-a3e9-a01ce9a72292 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "afe033a3-4e04-4249-beed-169a3e40a721-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1584.222325] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b556daf8-df4d-475b-a3e9-a01ce9a72292 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "afe033a3-4e04-4249-beed-169a3e40a721-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1584.222517] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b556daf8-df4d-475b-a3e9-a01ce9a72292 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "afe033a3-4e04-4249-beed-169a3e40a721-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1584.224689] env[68571]: INFO nova.compute.manager [None req-b556daf8-df4d-475b-a3e9-a01ce9a72292 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Terminating instance
[ 1584.226562] env[68571]: DEBUG nova.compute.manager [None req-b556daf8-df4d-475b-a3e9-a01ce9a72292 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}}
[ 1584.226816] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-b556daf8-df4d-475b-a3e9-a01ce9a72292 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}}
[ 1584.227686] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a2d4150a-8a20-4293-835e-96194005762a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1584.238662] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1776c3a3-fbf9-4149-999b-a841812210ce {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1584.249895] env[68571]: DEBUG nova.compute.manager [None req-1f7f3fcc-d403-4056-9967-103f75a9aec9 tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 780d6657-20dc-4d8c-acec-0e002f79372b] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1584.270297] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-b556daf8-df4d-475b-a3e9-a01ce9a72292 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance afe033a3-4e04-4249-beed-169a3e40a721 could not be found.
[ 1584.270501] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-b556daf8-df4d-475b-a3e9-a01ce9a72292 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}}
[ 1584.270679] env[68571]: INFO nova.compute.manager [None req-b556daf8-df4d-475b-a3e9-a01ce9a72292 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Took 0.04 seconds to destroy the instance on the hypervisor.
[ 1584.270922] env[68571]: DEBUG oslo.service.loopingcall [None req-b556daf8-df4d-475b-a3e9-a01ce9a72292 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}}
[ 1584.271173] env[68571]: DEBUG nova.compute.manager [-] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1584.271274] env[68571]: DEBUG nova.network.neutron [-] [instance: afe033a3-4e04-4249-beed-169a3e40a721] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1584.273993] env[68571]: DEBUG nova.compute.manager [None req-1f7f3fcc-d403-4056-9967-103f75a9aec9 tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 780d6657-20dc-4d8c-acec-0e002f79372b] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 1584.294446] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1f7f3fcc-d403-4056-9967-103f75a9aec9 tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Lock "780d6657-20dc-4d8c-acec-0e002f79372b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 208.543s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1584.298667] env[68571]: DEBUG nova.network.neutron [-] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1584.304912] env[68571]: DEBUG nova.compute.manager [None req-2cd6c9cb-67c2-48bc-bc05-f367eae44daf tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 47511138-2486-46a8-85d5-081388bb0b16] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1584.308295] env[68571]: INFO nova.compute.manager [-] [instance: afe033a3-4e04-4249-beed-169a3e40a721] Took 0.04 seconds to deallocate network for instance.
[ 1584.329910] env[68571]: DEBUG nova.compute.manager [None req-2cd6c9cb-67c2-48bc-bc05-f367eae44daf tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 47511138-2486-46a8-85d5-081388bb0b16] Instance disappeared before build. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2413}}
[ 1584.359494] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2cd6c9cb-67c2-48bc-bc05-f367eae44daf tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Lock "47511138-2486-46a8-85d5-081388bb0b16" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 200.547s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1584.374440] env[68571]: DEBUG nova.compute.manager [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}}
[ 1584.430952] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b556daf8-df4d-475b-a3e9-a01ce9a72292 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "afe033a3-4e04-4249-beed-169a3e40a721" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.209s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1584.439061] env[68571]: DEBUG oslo_concurrency.lockutils [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}}
[ 1584.439319] env[68571]: DEBUG oslo_concurrency.lockutils [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}}
[ 1584.440813] env[68571]: INFO nova.compute.claims [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28
[ 1584.641292] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4afad1f3-8abe-4bdc-a91e-bc20b84a08ce {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1584.648594] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c6fb6242-1976-4e38-bca6-f48cc296b07e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1584.678793] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-436cf72d-2711-4723-8ce5-eacedcccd3a0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1584.685626] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2af894ee-01e9-4563-89f5-23d91c89c025 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}}
[ 1584.698331] env[68571]: DEBUG nova.compute.provider_tree [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1584.708018] env[68571]: DEBUG nova.scheduler.client.report [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1584.726810] env[68571]: DEBUG oslo_concurrency.lockutils [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.287s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1584.727306] env[68571]: DEBUG nova.compute.manager [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}}
[ 1584.761009] env[68571]: DEBUG nova.compute.utils [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}}
[ 1584.765355] env[68571]: DEBUG nova.compute.manager [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}}
[ 1584.765355] env[68571]: DEBUG nova.network.neutron [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}}
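"Using /dev/sd instead of None" records the fallback applied when a block device mapping arrives without a device name. A toy version of next-free-name selection under that prefix (not Nova's actual implementation, which also parses existing suffixes):

    import string

    def next_device_name(used, prefix='/dev/sd'):
        for letter in string.ascii_lowercase:
            candidate = prefix + letter
            if candidate not in used:
                return candidate
        raise ValueError('no free device names under %s' % prefix)

    print(next_device_name({'/dev/sda'}))  # /dev/sdb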
{{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1584.822244] env[68571]: DEBUG nova.policy [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '75464428d107469f99f4308cfdb6b2df', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '506bd7cf3d9c4c54aabe7ef0be376fe9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 1584.841170] env[68571]: DEBUG nova.compute.manager [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Start spawning the instance on the hypervisor. {{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1584.869053] env[68571]: DEBUG nova.virt.hardware [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=<?>,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T21:24:40Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1584.869053] env[68571]: DEBUG nova.virt.hardware [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1584.869053] env[68571]: DEBUG nova.virt.hardware [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1584.869272] env[68571]: DEBUG nova.virt.hardware [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1584.869272] env[68571]: DEBUG nova.virt.hardware [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1584.869272] env[68571]: DEBUG
nova.virt.hardware [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1584.870021] env[68571]: DEBUG nova.virt.hardware [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1584.870021] env[68571]: DEBUG nova.virt.hardware [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1584.870021] env[68571]: DEBUG nova.virt.hardware [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1584.870021] env[68571]: DEBUG nova.virt.hardware [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1584.870238] env[68571]: DEBUG nova.virt.hardware [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1584.871231] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a2bd02a-5539-4067-8197-fe6ca7415619 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1584.879096] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6affb2d5-eb0b-4dd0-8731-38442ae53200 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1585.134273] env[68571]: DEBUG nova.network.neutron [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Successfully created port: d36fb884-1c76-4c7c-b3db-5f253cc4eb36 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1585.767824] env[68571]: DEBUG nova.network.neutron [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Successfully updated port: d36fb884-1c76-4c7c-b3db-5f253cc4eb36 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1585.786219] env[68571]: DEBUG oslo_concurrency.lockutils [None 
req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquiring lock "refresh_cache-4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1585.786379] env[68571]: DEBUG oslo_concurrency.lockutils [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquired lock "refresh_cache-4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1585.786577] env[68571]: DEBUG nova.network.neutron [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1585.837403] env[68571]: DEBUG nova.network.neutron [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1586.015533] env[68571]: DEBUG nova.network.neutron [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Updating instance_info_cache with network_info: [{"id": "d36fb884-1c76-4c7c-b3db-5f253cc4eb36", "address": "fa:16:3e:c5:8c:33", "network": {"id": "20ed8763-0c02-410b-9f5d-cb667bfdaa58", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1635789088-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "506bd7cf3d9c4c54aabe7ef0be376fe9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "50886eea-591a-452c-a27b-5f22cfc9df85", "external-id": "nsx-vlan-transportzone-578", "segmentation_id": 578, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd36fb884-1c", "ovs_interfaceid": "d36fb884-1c76-4c7c-b3db-5f253cc4eb36", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1586.032701] env[68571]: DEBUG oslo_concurrency.lockutils [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Releasing lock "refresh_cache-4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1586.033029] env[68571]: DEBUG nova.compute.manager [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 
tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Instance network_info: |[{"id": "d36fb884-1c76-4c7c-b3db-5f253cc4eb36", "address": "fa:16:3e:c5:8c:33", "network": {"id": "20ed8763-0c02-410b-9f5d-cb667bfdaa58", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1635789088-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "506bd7cf3d9c4c54aabe7ef0be376fe9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "50886eea-591a-452c-a27b-5f22cfc9df85", "external-id": "nsx-vlan-transportzone-578", "segmentation_id": 578, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd36fb884-1c", "ovs_interfaceid": "d36fb884-1c76-4c7c-b3db-5f253cc4eb36", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1586.033426] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:c5:8c:33', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '50886eea-591a-452c-a27b-5f22cfc9df85', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'd36fb884-1c76-4c7c-b3db-5f253cc4eb36', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1586.041503] env[68571]: DEBUG oslo.service.loopingcall [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1586.041879] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1586.042143] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-ba292ecc-5626-4356-86dd-d6b08ecdfdb2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1586.063047] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1586.063047] env[68571]: value = "task-3467728" [ 1586.063047] env[68571]: _type = "Task" [ 1586.063047] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1586.070759] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467728, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1586.175712] env[68571]: DEBUG nova.compute.manager [req-3016f27e-3f3c-41f6-ad34-6b9a5b3af9ed req-05dc2ab6-1da1-43bb-ae37-cd66f229f80b service nova] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Received event network-vif-plugged-d36fb884-1c76-4c7c-b3db-5f253cc4eb36 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1586.175850] env[68571]: DEBUG oslo_concurrency.lockutils [req-3016f27e-3f3c-41f6-ad34-6b9a5b3af9ed req-05dc2ab6-1da1-43bb-ae37-cd66f229f80b service nova] Acquiring lock "4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1586.176069] env[68571]: DEBUG oslo_concurrency.lockutils [req-3016f27e-3f3c-41f6-ad34-6b9a5b3af9ed req-05dc2ab6-1da1-43bb-ae37-cd66f229f80b service nova] Lock "4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1586.176240] env[68571]: DEBUG oslo_concurrency.lockutils [req-3016f27e-3f3c-41f6-ad34-6b9a5b3af9ed req-05dc2ab6-1da1-43bb-ae37-cd66f229f80b service nova] Lock "4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1586.176404] env[68571]: DEBUG nova.compute.manager [req-3016f27e-3f3c-41f6-ad34-6b9a5b3af9ed req-05dc2ab6-1da1-43bb-ae37-cd66f229f80b service nova] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] No waiting events found dispatching network-vif-plugged-d36fb884-1c76-4c7c-b3db-5f253cc4eb36 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1586.176568] env[68571]: WARNING nova.compute.manager [req-3016f27e-3f3c-41f6-ad34-6b9a5b3af9ed req-05dc2ab6-1da1-43bb-ae37-cd66f229f80b service nova] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Received unexpected event network-vif-plugged-d36fb884-1c76-4c7c-b3db-5f253cc4eb36 for instance with vm_state building and task_state spawning. [ 1586.176730] env[68571]: DEBUG nova.compute.manager [req-3016f27e-3f3c-41f6-ad34-6b9a5b3af9ed req-05dc2ab6-1da1-43bb-ae37-cd66f229f80b service nova] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Received event network-changed-d36fb884-1c76-4c7c-b3db-5f253cc4eb36 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1586.176883] env[68571]: DEBUG nova.compute.manager [req-3016f27e-3f3c-41f6-ad34-6b9a5b3af9ed req-05dc2ab6-1da1-43bb-ae37-cd66f229f80b service nova] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Refreshing instance network info cache due to event network-changed-d36fb884-1c76-4c7c-b3db-5f253cc4eb36.
{{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1586.177073] env[68571]: DEBUG oslo_concurrency.lockutils [req-3016f27e-3f3c-41f6-ad34-6b9a5b3af9ed req-05dc2ab6-1da1-43bb-ae37-cd66f229f80b service nova] Acquiring lock "refresh_cache-4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1586.177211] env[68571]: DEBUG oslo_concurrency.lockutils [req-3016f27e-3f3c-41f6-ad34-6b9a5b3af9ed req-05dc2ab6-1da1-43bb-ae37-cd66f229f80b service nova] Acquired lock "refresh_cache-4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1586.177360] env[68571]: DEBUG nova.network.neutron [req-3016f27e-3f3c-41f6-ad34-6b9a5b3af9ed req-05dc2ab6-1da1-43bb-ae37-cd66f229f80b service nova] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Refreshing network info cache for port d36fb884-1c76-4c7c-b3db-5f253cc4eb36 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1586.419253] env[68571]: DEBUG nova.network.neutron [req-3016f27e-3f3c-41f6-ad34-6b9a5b3af9ed req-05dc2ab6-1da1-43bb-ae37-cd66f229f80b service nova] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Updated VIF entry in instance network info cache for port d36fb884-1c76-4c7c-b3db-5f253cc4eb36. {{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1586.419646] env[68571]: DEBUG nova.network.neutron [req-3016f27e-3f3c-41f6-ad34-6b9a5b3af9ed req-05dc2ab6-1da1-43bb-ae37-cd66f229f80b service nova] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Updating instance_info_cache with network_info: [{"id": "d36fb884-1c76-4c7c-b3db-5f253cc4eb36", "address": "fa:16:3e:c5:8c:33", "network": {"id": "20ed8763-0c02-410b-9f5d-cb667bfdaa58", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-1635789088-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "506bd7cf3d9c4c54aabe7ef0be376fe9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "50886eea-591a-452c-a27b-5f22cfc9df85", "external-id": "nsx-vlan-transportzone-578", "segmentation_id": 578, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapd36fb884-1c", "ovs_interfaceid": "d36fb884-1c76-4c7c-b3db-5f253cc4eb36", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1586.430594] env[68571]: DEBUG oslo_concurrency.lockutils [req-3016f27e-3f3c-41f6-ad34-6b9a5b3af9ed req-05dc2ab6-1da1-43bb-ae37-cd66f229f80b service nova] Releasing lock "refresh_cache-4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1586.572654] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467728, 'name': CreateVM_Task, 'duration_secs': 0.318351} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1586.572832] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1586.573452] env[68571]: DEBUG oslo_concurrency.lockutils [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1586.573650] env[68571]: DEBUG oslo_concurrency.lockutils [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1586.574040] env[68571]: DEBUG oslo_concurrency.lockutils [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1586.574306] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-17bf476f-f310-4a61-915b-9a863752ccdc {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1586.578606] env[68571]: DEBUG oslo_vmware.api [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Waiting for the task: (returnval){ [ 1586.578606] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52e73e70-ddb2-75d4-a1a4-993d7ae414bd" [ 1586.578606] env[68571]: _type = "Task" [ 1586.578606] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1586.585925] env[68571]: DEBUG oslo_vmware.api [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52e73e70-ddb2-75d4-a1a4-993d7ae414bd, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1587.090093] env[68571]: DEBUG oslo_concurrency.lockutils [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1587.090093] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1587.090093] env[68571]: DEBUG oslo_concurrency.lockutils [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1602.490693] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1603.489952] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1603.490258] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1603.501938] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1603.502255] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1603.502335] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1603.502463] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) 
update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1603.503529] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-069657ec-7a20-4419-a169-64a418fcae9e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1603.512180] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-10591877-f819-42a9-bae1-65f3e65a5c37 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1603.525936] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-debbb566-fcfd-44b8-957a-85e7f764c0a8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1603.531975] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f25a3cf1-bcbd-47bc-881a-6b4c0d958051 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1603.561069] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180908MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1603.561269] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1603.561473] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1603.633626] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f5328efa-b3e0-48b2-8f13-9715e46cb017 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1603.633792] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance e025f82d-a6a8-4dd4-b891-872f4b2fa176 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1603.633925] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1603.634061] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 56c7e368-4032-4028-83f0-58b0cd3b3cbd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1603.634186] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 47df3a07-1271-482c-bd3a-92fb9cef17bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1603.634306] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 73ba7761-3724-46ed-95c5-e93a6627a2d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1603.634424] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance d890a035-a14e-4be0-97c8-87edd9bb88e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1603.634541] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 9e8c8d14-144f-42e3-8556-796651b7b04f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1603.634657] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 1f8dd053-ebd8-4ad9-a607-ab364a3320ca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1603.634771] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1603.645065] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 8506e00f-2b77-4fa1-804a-8e548b78ee7d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1603.654886] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 7fd03349-420c-4076-959c-31562e95098d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1603.664208] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 5deee3f1-70a0-4c0d-bda6-365235ca0d78 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1603.664424] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1603.664570] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1603.800452] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-698f07d2-c142-4a22-a81e-a9809f6e459c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1603.807955] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3297e3bb-ad3a-4214-a51b-5ba30a443851 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1603.836629] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d9b2bb1-3f11-48ea-be15-dff550326e74 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1603.843579] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d91fd030-15f9-4b03-833f-ee886760f89b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1603.857371] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1603.865382] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 
'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1603.878448] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1603.878630] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.317s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1604.873533] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1605.489088] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1605.680317] env[68571]: DEBUG oslo_concurrency.lockutils [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquiring lock "f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1605.680532] env[68571]: DEBUG oslo_concurrency.lockutils [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Lock "f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1608.489456] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1608.489745] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1608.489745] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1608.512431] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Skipping network cache update for instance because it is Building.
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1608.512597] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1608.512728] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1608.512852] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1608.512977] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1608.513112] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1608.513233] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1608.513351] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1608.513466] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1608.513582] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1608.513698] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1608.514255] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1610.488898] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1610.489285] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1611.490353] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1617.485516] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1629.382858] env[68571]: WARNING oslo_vmware.rw_handles [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1629.382858] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1629.382858] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1629.382858] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1629.382858] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1629.382858] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 1629.382858] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1629.382858] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1629.382858] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1629.382858] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1629.382858] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1629.382858] env[68571]: ERROR oslo_vmware.rw_handles [ 1629.383921] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/4346c087-29f7-4033-a9e7-b3027e9f06e9/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image 
/opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1629.385356] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1629.385629] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Copying Virtual Disk [datastore1] vmware_temp/4346c087-29f7-4033-a9e7-b3027e9f06e9/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/4346c087-29f7-4033-a9e7-b3027e9f06e9/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1629.385969] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-b6a83799-7185-4fa8-b517-d3f71c1abdfd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1629.394874] env[68571]: DEBUG oslo_vmware.api [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Waiting for the task: (returnval){ [ 1629.394874] env[68571]: value = "task-3467729" [ 1629.394874] env[68571]: _type = "Task" [ 1629.394874] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1629.402414] env[68571]: DEBUG oslo_vmware.api [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Task: {'id': task-3467729, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1629.905812] env[68571]: DEBUG oslo_vmware.exceptions [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Fault InvalidArgument not matched. 
{{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1629.906078] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1629.906674] env[68571]: ERROR nova.compute.manager [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1629.906674] env[68571]: Faults: ['InvalidArgument'] [ 1629.906674] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Traceback (most recent call last): [ 1629.906674] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1629.906674] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] yield resources [ 1629.906674] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1629.906674] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] self.driver.spawn(context, instance, image_meta, [ 1629.906674] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1629.906674] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1629.906674] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1629.906674] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] self._fetch_image_if_missing(context, vi) [ 1629.906674] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1629.907317] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] image_cache(vi, tmp_image_ds_loc) [ 1629.907317] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1629.907317] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] vm_util.copy_virtual_disk( [ 1629.907317] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1629.907317] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] session._wait_for_task(vmdk_copy_task) [ 1629.907317] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, 
in _wait_for_task [ 1629.907317] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] return self.wait_for_task(task_ref) [ 1629.907317] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1629.907317] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] return evt.wait() [ 1629.907317] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1629.907317] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] result = hub.switch() [ 1629.907317] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1629.907317] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] return self.greenlet.switch() [ 1629.907658] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1629.907658] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] self.f(*self.args, **self.kw) [ 1629.907658] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1629.907658] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] raise exceptions.translate_fault(task_info.error) [ 1629.907658] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1629.907658] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Faults: ['InvalidArgument'] [ 1629.907658] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] [ 1629.907658] env[68571]: INFO nova.compute.manager [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Terminating instance [ 1629.908499] env[68571]: DEBUG oslo_concurrency.lockutils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1629.908703] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1629.908933] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-4674ea82-b74e-4552-93fa-1705a594b178 {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1629.911131] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Acquiring lock "refresh_cache-f5328efa-b3e0-48b2-8f13-9715e46cb017" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1629.911231] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Acquired lock "refresh_cache-f5328efa-b3e0-48b2-8f13-9715e46cb017" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1629.911369] env[68571]: DEBUG nova.network.neutron [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1629.917855] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1629.918071] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1629.919226] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-82fcc48a-b7d8-49d0-978d-6fe6af8af187 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1629.926290] env[68571]: DEBUG oslo_vmware.api [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Waiting for the task: (returnval){ [ 1629.926290] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52d3cbff-a5ab-d786-0f41-44cde4252108" [ 1629.926290] env[68571]: _type = "Task" [ 1629.926290] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1629.933579] env[68571]: DEBUG oslo_vmware.api [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52d3cbff-a5ab-d786-0f41-44cde4252108, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1629.940762] env[68571]: DEBUG nova.network.neutron [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Instance cache missing network info. 
{{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1630.001104] env[68571]: DEBUG nova.network.neutron [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1630.009594] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Releasing lock "refresh_cache-f5328efa-b3e0-48b2-8f13-9715e46cb017" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1630.009970] env[68571]: DEBUG nova.compute.manager [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1630.010195] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1630.011259] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fa345fb4-c1ce-4cde-bd28-cd28ea0696f6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1630.018858] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1630.019074] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-33534af3-a1f5-49dd-b1fd-86a922c0b814 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1630.050029] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1630.050029] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1630.050029] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Deleting the datastore file [datastore1] f5328efa-b3e0-48b2-8f13-9715e46cb017 {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1630.050029] env[68571]: 
DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-fb0ee2e6-0420-491a-82da-7513cf544c09 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1630.055253] env[68571]: DEBUG oslo_vmware.api [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Waiting for the task: (returnval){ [ 1630.055253] env[68571]: value = "task-3467731" [ 1630.055253] env[68571]: _type = "Task" [ 1630.055253] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1630.063815] env[68571]: DEBUG oslo_vmware.api [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Task: {'id': task-3467731, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1630.437071] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1630.437415] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Creating directory with path [datastore1] vmware_temp/0d6c4f65-ef6c-44b3-9270-09a51062e40e/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1630.437453] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-daa83fa0-4e52-4df2-a679-7641a776fba5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1630.448541] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Created directory with path [datastore1] vmware_temp/0d6c4f65-ef6c-44b3-9270-09a51062e40e/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1630.448720] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Fetch image to [datastore1] vmware_temp/0d6c4f65-ef6c-44b3-9270-09a51062e40e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1630.448884] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/0d6c4f65-ef6c-44b3-9270-09a51062e40e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1630.449634] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d923051c-a064-41d2-a6ac-175c2229ad54 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1630.455903] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61a915dc-a5d8-44c7-ba02-32bb170168d1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1630.464501] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ff07128b-d594-45e6-b746-23af885f61ce {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1630.494502] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fc362f2-8a42-41a6-be3e-6367f4c65fb7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1630.499735] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d7a875fa-4a21-43b8-b079-3e132d0d3339 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1630.518197] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1630.563196] env[68571]: DEBUG oslo_vmware.api [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Task: {'id': task-3467731, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.041592} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1630.563445] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1630.563631] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1630.563803] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1630.563976] env[68571]: INFO nova.compute.manager [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Took 0.55 seconds to destroy the instance on the hypervisor. [ 1630.564297] env[68571]: DEBUG oslo.service.loopingcall [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1630.564539] env[68571]: DEBUG nova.compute.manager [-] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Skipping network deallocation for instance since networking was not requested. {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1630.566734] env[68571]: DEBUG oslo_vmware.rw_handles [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0d6c4f65-ef6c-44b3-9270-09a51062e40e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1.
{{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1630.568086] env[68571]: DEBUG nova.compute.claims [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1630.568266] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1630.568480] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1630.629900] env[68571]: DEBUG oslo_vmware.rw_handles [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1630.629996] env[68571]: DEBUG oslo_vmware.rw_handles [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0d6c4f65-ef6c-44b3-9270-09a51062e40e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1630.794524] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1955c7f-18da-41c7-a848-de8b94b595f6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1630.801905] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44f61412-c619-4d47-ade7-48414383dbc7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1630.832245] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eece46c8-9e5a-4f1d-8579-590f71cfc5ed {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1630.839285] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e0ef0c1-9f75-462f-b394-0d43f9dad53f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1630.852063] env[68571]: DEBUG nova.compute.provider_tree [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1630.860428] env[68571]: DEBUG nova.scheduler.client.report [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1630.874920] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.306s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1630.875453] env[68571]: ERROR nova.compute.manager [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1630.875453] env[68571]: Faults: ['InvalidArgument'] [ 1630.875453] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Traceback (most recent call last): [ 1630.875453] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1630.875453] env[68571]: ERROR nova.compute.manager [instance: 
f5328efa-b3e0-48b2-8f13-9715e46cb017] self.driver.spawn(context, instance, image_meta, [ 1630.875453] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1630.875453] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1630.875453] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1630.875453] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] self._fetch_image_if_missing(context, vi) [ 1630.875453] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1630.875453] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] image_cache(vi, tmp_image_ds_loc) [ 1630.875453] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1630.875837] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] vm_util.copy_virtual_disk( [ 1630.875837] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1630.875837] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] session._wait_for_task(vmdk_copy_task) [ 1630.875837] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1630.875837] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] return self.wait_for_task(task_ref) [ 1630.875837] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1630.875837] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] return evt.wait() [ 1630.875837] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1630.875837] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] result = hub.switch() [ 1630.875837] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1630.875837] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] return self.greenlet.switch() [ 1630.875837] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1630.875837] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] self.f(*self.args, **self.kw) [ 1630.876199] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1630.876199] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] raise exceptions.translate_fault(task_info.error) [ 1630.876199] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1630.876199] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Faults: ['InvalidArgument'] [ 1630.876199] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] [ 1630.876199] env[68571]: DEBUG nova.compute.utils [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1630.877661] env[68571]: DEBUG nova.compute.manager [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Build of instance f5328efa-b3e0-48b2-8f13-9715e46cb017 was re-scheduled: A specified parameter was not correct: fileType [ 1630.877661] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1630.878088] env[68571]: DEBUG nova.compute.manager [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1630.878321] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Acquiring lock "refresh_cache-f5328efa-b3e0-48b2-8f13-9715e46cb017" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1630.878474] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Acquired lock "refresh_cache-f5328efa-b3e0-48b2-8f13-9715e46cb017" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1630.878632] env[68571]: DEBUG nova.network.neutron [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1630.902105] env[68571]: DEBUG nova.network.neutron [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Instance cache missing network info. 
{{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1630.959826] env[68571]: DEBUG nova.network.neutron [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1630.968893] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Releasing lock "refresh_cache-f5328efa-b3e0-48b2-8f13-9715e46cb017" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1630.969132] env[68571]: DEBUG nova.compute.manager [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1630.969325] env[68571]: DEBUG nova.compute.manager [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Skipping network deallocation for instance since networking was not requested. {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 1631.055233] env[68571]: INFO nova.scheduler.client.report [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Deleted allocations for instance f5328efa-b3e0-48b2-8f13-9715e46cb017 [ 1631.076519] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5651f19e-ffd1-4dd7-952e-ba245149b6b6 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Lock "f5328efa-b3e0-48b2-8f13-9715e46cb017" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 629.140s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1631.077779] env[68571]: DEBUG oslo_concurrency.lockutils [None req-7bc7e02a-e021-4b38-8708-16a9402c35a7 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Lock "f5328efa-b3e0-48b2-8f13-9715e46cb017" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 431.662s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1631.078058] env[68571]: DEBUG oslo_concurrency.lockutils [None req-7bc7e02a-e021-4b38-8708-16a9402c35a7 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Acquiring lock "f5328efa-b3e0-48b2-8f13-9715e46cb017-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1631.078467] env[68571]: DEBUG oslo_concurrency.lockutils [None req-7bc7e02a-e021-4b38-8708-16a9402c35a7 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Lock "f5328efa-b3e0-48b2-8f13-9715e46cb017-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1631.078566] env[68571]: DEBUG oslo_concurrency.lockutils [None req-7bc7e02a-e021-4b38-8708-16a9402c35a7 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Lock "f5328efa-b3e0-48b2-8f13-9715e46cb017-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1631.080467] env[68571]: INFO nova.compute.manager [None req-7bc7e02a-e021-4b38-8708-16a9402c35a7 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Terminating instance [ 1631.081992] env[68571]: DEBUG oslo_concurrency.lockutils [None req-7bc7e02a-e021-4b38-8708-16a9402c35a7 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Acquiring lock "refresh_cache-f5328efa-b3e0-48b2-8f13-9715e46cb017" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1631.082272] env[68571]: DEBUG oslo_concurrency.lockutils [None req-7bc7e02a-e021-4b38-8708-16a9402c35a7 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Acquired lock "refresh_cache-f5328efa-b3e0-48b2-8f13-9715e46cb017" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1631.082360] env[68571]: DEBUG nova.network.neutron [None req-7bc7e02a-e021-4b38-8708-16a9402c35a7 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1631.090420] env[68571]: DEBUG nova.compute.manager [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1631.109322] env[68571]: DEBUG nova.network.neutron [None req-7bc7e02a-e021-4b38-8708-16a9402c35a7 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Instance cache missing network info. 
{{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1631.142178] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1631.142432] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1631.144363] env[68571]: INFO nova.compute.claims [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1631.168815] env[68571]: DEBUG nova.network.neutron [None req-7bc7e02a-e021-4b38-8708-16a9402c35a7 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1631.180710] env[68571]: DEBUG oslo_concurrency.lockutils [None req-7bc7e02a-e021-4b38-8708-16a9402c35a7 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Releasing lock "refresh_cache-f5328efa-b3e0-48b2-8f13-9715e46cb017" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1631.181056] env[68571]: DEBUG nova.compute.manager [None req-7bc7e02a-e021-4b38-8708-16a9402c35a7 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Start destroying the instance on the hypervisor. 
{{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1631.181259] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-7bc7e02a-e021-4b38-8708-16a9402c35a7 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1631.183070] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-ad65be3e-3df4-4d67-b4d2-a9555153fcdf {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1631.194121] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3a3a803c-1d34-4c64-aacd-aa06c5e2e938 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1631.226055] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-7bc7e02a-e021-4b38-8708-16a9402c35a7 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f5328efa-b3e0-48b2-8f13-9715e46cb017 could not be found. [ 1631.226420] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-7bc7e02a-e021-4b38-8708-16a9402c35a7 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1631.226509] env[68571]: INFO nova.compute.manager [None req-7bc7e02a-e021-4b38-8708-16a9402c35a7 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1631.226695] env[68571]: DEBUG oslo.service.loopingcall [None req-7bc7e02a-e021-4b38-8708-16a9402c35a7 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1631.229483] env[68571]: DEBUG nova.compute.manager [-] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1631.229653] env[68571]: DEBUG nova.network.neutron [-] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1631.387457] env[68571]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68571) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1631.388081] env[68571]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1631.388360] env[68571]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1631.388360] env[68571]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1631.388360] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1631.388360] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1631.388360] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1631.388360] env[68571]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1631.388360] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1631.388360] env[68571]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1631.388360] env[68571]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1631.388360] env[68571]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-0a6d9fa3-b28b-4f4a-8fdb-3a4435c268ee'] [ 1631.388360] env[68571]: ERROR oslo.service.loopingcall [ 1631.388360] env[68571]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1631.388360] env[68571]: ERROR oslo.service.loopingcall [ 1631.388360] env[68571]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1631.388360] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1631.388360] env[68571]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1631.388879] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1631.388879] env[68571]: ERROR oslo.service.loopingcall result = f(*args,
**kwargs) [ 1631.388879] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1631.388879] env[68571]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1631.388879] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1631.388879] env[68571]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1631.388879] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1631.388879] env[68571]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1631.388879] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1631.388879] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1631.388879] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1631.388879] env[68571]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1631.388879] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1631.388879] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1631.388879] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1631.388879] env[68571]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1631.388879] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1631.388879] env[68571]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1631.389437] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1631.389437] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1631.389437] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1631.389437] env[68571]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1631.389437] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1631.389437] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1631.389437] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1631.389437] env[68571]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1631.389437] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1631.389437] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1631.389437] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1631.389437] env[68571]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1631.389437] env[68571]: ERROR oslo.service.loopingcall File 
"/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1631.389437] env[68571]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1631.389437] env[68571]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1631.389437] env[68571]: ERROR oslo.service.loopingcall [ 1631.389843] env[68571]: ERROR nova.compute.manager [None req-7bc7e02a-e021-4b38-8708-16a9402c35a7 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1631.417778] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64f9cd83-d9e7-4a30-92dd-054c5a7df1ad {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1631.423593] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-efb02e5c-2097-4589-a900-ae51083031fc {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1631.428576] env[68571]: ERROR nova.compute.manager [None req-7bc7e02a-e021-4b38-8708-16a9402c35a7 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. 
[ 1631.428576] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Traceback (most recent call last): [ 1631.428576] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1631.428576] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] ret = obj(*args, **kwargs) [ 1631.428576] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1631.428576] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] exception_handler_v20(status_code, error_body) [ 1631.428576] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1631.428576] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] raise client_exc(message=error_message, [ 1631.428576] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1631.428576] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Neutron server returns request_ids: ['req-0a6d9fa3-b28b-4f4a-8fdb-3a4435c268ee'] [ 1631.428576] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] [ 1631.428949] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] During handling of the above exception, another exception occurred: [ 1631.428949] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] [ 1631.428949] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Traceback (most recent call last): [ 1631.428949] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1631.428949] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] self._delete_instance(context, instance, bdms) [ 1631.428949] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1631.428949] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] self._shutdown_instance(context, instance, bdms) [ 1631.428949] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1631.428949] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] self._try_deallocate_network(context, instance, requested_networks) [ 1631.428949] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1631.428949] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] with excutils.save_and_reraise_exception(): [ 1631.428949] env[68571]: ERROR 
nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1631.428949] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] self.force_reraise() [ 1631.429354] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1631.429354] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] raise self.value [ 1631.429354] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1631.429354] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] _deallocate_network_with_retries() [ 1631.429354] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1631.429354] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] return evt.wait() [ 1631.429354] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1631.429354] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] result = hub.switch() [ 1631.429354] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1631.429354] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] return self.greenlet.switch() [ 1631.429354] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1631.429354] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] result = func(*self.args, **self.kw) [ 1631.429651] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1631.429651] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] result = f(*args, **kwargs) [ 1631.429651] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1631.429651] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] self._deallocate_network( [ 1631.429651] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1631.429651] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] self.network_api.deallocate_for_instance( [ 1631.429651] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1631.429651] env[68571]: ERROR nova.compute.manager [instance: 
f5328efa-b3e0-48b2-8f13-9715e46cb017] data = neutron.list_ports(**search_opts) [ 1631.429651] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1631.429651] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] ret = obj(*args, **kwargs) [ 1631.429651] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1631.429651] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] return self.list('ports', self.ports_path, retrieve_all, [ 1631.429651] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1631.430022] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] ret = obj(*args, **kwargs) [ 1631.430022] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1631.430022] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] for r in self._pagination(collection, path, **params): [ 1631.430022] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1631.430022] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] res = self.get(path, params=params) [ 1631.430022] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1631.430022] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] ret = obj(*args, **kwargs) [ 1631.430022] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1631.430022] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] return self.retry_request("GET", action, body=body, [ 1631.430022] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1631.430022] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] ret = obj(*args, **kwargs) [ 1631.430022] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1631.430022] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] return self.do_request(method, action, body=body, [ 1631.430343] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1631.430343] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] ret = obj(*args, **kwargs) [ 1631.430343] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1631.430343] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] self._handle_fault_response(status_code, replybody, resp) [ 1631.430343] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1631.430343] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1631.430343] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1631.430343] env[68571]: ERROR nova.compute.manager [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] [ 1631.459113] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e48e06a7-2da0-475c-a94c-9823d58ea1d7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1631.465859] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5d89594a-415b-463d-99e6-84d4c27d4bf7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1631.480571] env[68571]: DEBUG nova.compute.provider_tree [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1631.482791] env[68571]: DEBUG oslo_concurrency.lockutils [None req-7bc7e02a-e021-4b38-8708-16a9402c35a7 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Lock "f5328efa-b3e0-48b2-8f13-9715e46cb017" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.405s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1631.488577] env[68571]: DEBUG nova.scheduler.client.report [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1631.501236] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.359s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1631.501713] env[68571]: DEBUG nova.compute.manager [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 
tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1631.531680] env[68571]: INFO nova.compute.manager [None req-7bc7e02a-e021-4b38-8708-16a9402c35a7 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] [instance: f5328efa-b3e0-48b2-8f13-9715e46cb017] Successfully reverted task state from None on failure for instance. [ 1631.535151] env[68571]: DEBUG nova.compute.utils [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1631.537094] env[68571]: ERROR oslo_messaging.rpc.server [None req-7bc7e02a-e021-4b38-8708-16a9402c35a7 tempest-ServerShowV247Test-923331119 tempest-ServerShowV247Test-923331119-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1631.537094] env[68571]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1631.537094] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1631.537094] env[68571]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1631.537094] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1631.537094] env[68571]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1631.537094] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1631.537094] env[68571]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1631.537094] env[68571]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1631.537094] env[68571]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-0a6d9fa3-b28b-4f4a-8fdb-3a4435c268ee'] [ 1631.537094] env[68571]: ERROR oslo_messaging.rpc.server [ 1631.537094] env[68571]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1631.537094] env[68571]: ERROR oslo_messaging.rpc.server [ 1631.537094] env[68571]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1631.537094] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1631.537094] env[68571]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1631.537637] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1631.537637] env[68571]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1631.537637] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1631.537637] 
env[68571]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1631.537637] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1631.537637] env[68571]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1631.537637] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1631.537637] env[68571]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1631.537637] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1631.537637] env[68571]: ERROR oslo_messaging.rpc.server raise self.value [ 1631.537637] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1631.537637] env[68571]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1631.537637] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1631.537637] env[68571]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1631.537637] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1631.537637] env[68571]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1631.537637] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1631.537637] env[68571]: ERROR oslo_messaging.rpc.server raise self.value [ 1631.538182] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1631.538182] env[68571]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1631.538182] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 1631.538182] env[68571]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1631.538182] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1631.538182] env[68571]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1631.538182] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1631.538182] env[68571]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1631.538182] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1631.538182] env[68571]: ERROR oslo_messaging.rpc.server raise self.value [ 1631.538182] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1631.538182] env[68571]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1631.538182] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 1631.538182] env[68571]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1631.538182] env[68571]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1631.538182] env[68571]: ERROR oslo_messaging.rpc.server return f(*args, **kwargs) [ 1631.538182] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 1631.538182] env[68571]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1631.538704] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1631.538704] env[68571]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1631.538704] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1631.538704] env[68571]: ERROR oslo_messaging.rpc.server raise self.value [ 1631.538704] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1631.538704] env[68571]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1631.538704] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1631.538704] env[68571]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1631.538704] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1631.538704] env[68571]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1631.538704] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1631.538704] env[68571]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1631.538704] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1631.538704] env[68571]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1631.538704] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1631.538704] env[68571]: ERROR oslo_messaging.rpc.server raise self.value [ 1631.538704] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1631.538704] env[68571]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1631.539208] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1631.539208] env[68571]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1631.539208] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1631.539208] env[68571]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1631.539208] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1631.539208] env[68571]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1631.539208] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 
1631.539208] env[68571]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1631.539208] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1631.539208] env[68571]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1631.539208] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1631.539208] env[68571]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1631.539208] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1631.539208] env[68571]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1631.539208] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1631.539208] env[68571]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1631.539208] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1631.539208] env[68571]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1631.539727] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1631.539727] env[68571]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1631.539727] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1631.539727] env[68571]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1631.539727] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1631.539727] env[68571]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1631.539727] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1631.539727] env[68571]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1631.539727] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1631.539727] env[68571]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1631.539727] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1631.539727] env[68571]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1631.539727] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1631.539727] env[68571]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1631.539727] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1631.539727] env[68571]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1631.539727] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1631.539727] env[68571]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1631.540215] env[68571]: 
ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1631.540215] env[68571]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1631.540215] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1631.540215] env[68571]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1631.540215] env[68571]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1631.540215] env[68571]: ERROR oslo_messaging.rpc.server [ 1631.540215] env[68571]: DEBUG nova.compute.manager [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1631.540215] env[68571]: DEBUG nova.network.neutron [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1631.544390] env[68571]: DEBUG nova.compute.manager [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1631.603534] env[68571]: DEBUG nova.policy [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5e3b3a596663400b8c5139e012387923', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e9a5e8481daf4571a59da04c7603b699', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 1631.606939] env[68571]: DEBUG nova.compute.manager [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1631.632218] env[68571]: DEBUG nova.virt.hardware [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=<?>,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T21:24:40Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1631.632462] env[68571]: DEBUG nova.virt.hardware [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1631.632621] env[68571]: DEBUG nova.virt.hardware [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1631.632802] env[68571]: DEBUG nova.virt.hardware [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1631.632946] env[68571]: DEBUG nova.virt.hardware [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1631.633107] env[68571]: DEBUG nova.virt.hardware [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1631.633316] env[68571]: DEBUG nova.virt.hardware [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1631.633488] env[68571]: DEBUG nova.virt.hardware [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1631.633633] env[68571]: DEBUG nova.virt.hardware [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037
tempest-ServersTestJSON-433190037-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1631.633792] env[68571]: DEBUG nova.virt.hardware [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1631.633962] env[68571]: DEBUG nova.virt.hardware [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1631.634828] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-46a522ea-66de-487e-b6ae-f05705757bef {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1631.644316] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b39aaf88-4c33-4564-94df-085111f38706 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1631.941267] env[68571]: DEBUG nova.network.neutron [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Successfully created port: 792b3557-6600-4a11-9a32-d6b228281170 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1632.525664] env[68571]: DEBUG nova.compute.manager [req-3e804f9e-fd21-4f02-8e50-e288d5a99dc3 req-98894659-e73e-45db-a940-c87a0c478045 service nova] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Received event network-vif-plugged-792b3557-6600-4a11-9a32-d6b228281170 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1632.525957] env[68571]: DEBUG oslo_concurrency.lockutils [req-3e804f9e-fd21-4f02-8e50-e288d5a99dc3 req-98894659-e73e-45db-a940-c87a0c478045 service nova] Acquiring lock "8506e00f-2b77-4fa1-804a-8e548b78ee7d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1632.526110] env[68571]: DEBUG oslo_concurrency.lockutils [req-3e804f9e-fd21-4f02-8e50-e288d5a99dc3 req-98894659-e73e-45db-a940-c87a0c478045 service nova] Lock "8506e00f-2b77-4fa1-804a-8e548b78ee7d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1632.526286] env[68571]: DEBUG oslo_concurrency.lockutils [req-3e804f9e-fd21-4f02-8e50-e288d5a99dc3 req-98894659-e73e-45db-a940-c87a0c478045 service nova] Lock "8506e00f-2b77-4fa1-804a-8e548b78ee7d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1632.526457] env[68571]: DEBUG nova.compute.manager [req-3e804f9e-fd21-4f02-8e50-e288d5a99dc3 req-98894659-e73e-45db-a940-c87a0c478045 service nova] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] No
waiting events found dispatching network-vif-plugged-792b3557-6600-4a11-9a32-d6b228281170 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1632.526622] env[68571]: WARNING nova.compute.manager [req-3e804f9e-fd21-4f02-8e50-e288d5a99dc3 req-98894659-e73e-45db-a940-c87a0c478045 service nova] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Received unexpected event network-vif-plugged-792b3557-6600-4a11-9a32-d6b228281170 for instance with vm_state building and task_state spawning. [ 1632.708932] env[68571]: DEBUG nova.network.neutron [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Successfully updated port: 792b3557-6600-4a11-9a32-d6b228281170 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1632.729570] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Acquiring lock "refresh_cache-8506e00f-2b77-4fa1-804a-8e548b78ee7d" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1632.729765] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Acquired lock "refresh_cache-8506e00f-2b77-4fa1-804a-8e548b78ee7d" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1632.729958] env[68571]: DEBUG nova.network.neutron [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1632.764378] env[68571]: DEBUG nova.network.neutron [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Instance cache missing network info. 
{{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1632.923462] env[68571]: DEBUG nova.network.neutron [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Updating instance_info_cache with network_info: [{"id": "792b3557-6600-4a11-9a32-d6b228281170", "address": "fa:16:3e:de:45:bc", "network": {"id": "f85d7976-c11f-4960-823b-6247b39b5626", "bridge": "br-int", "label": "tempest-ServersTestJSON-1166957619-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e9a5e8481daf4571a59da04c7603b699", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7d377d75-3add-4a15-8691-74b2eb010924", "external-id": "nsx-vlan-transportzone-71", "segmentation_id": 71, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap792b3557-66", "ovs_interfaceid": "792b3557-6600-4a11-9a32-d6b228281170", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1632.940531] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Releasing lock "refresh_cache-8506e00f-2b77-4fa1-804a-8e548b78ee7d" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1632.940843] env[68571]: DEBUG nova.compute.manager [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Instance network_info: |[{"id": "792b3557-6600-4a11-9a32-d6b228281170", "address": "fa:16:3e:de:45:bc", "network": {"id": "f85d7976-c11f-4960-823b-6247b39b5626", "bridge": "br-int", "label": "tempest-ServersTestJSON-1166957619-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e9a5e8481daf4571a59da04c7603b699", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7d377d75-3add-4a15-8691-74b2eb010924", "external-id": "nsx-vlan-transportzone-71", "segmentation_id": 71, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap792b3557-66", "ovs_interfaceid": "792b3557-6600-4a11-9a32-d6b228281170", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1632.941295] env[68571]: DEBUG 
nova.virt.vmwareapi.vmops [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:de:45:bc', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '7d377d75-3add-4a15-8691-74b2eb010924', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '792b3557-6600-4a11-9a32-d6b228281170', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1632.948838] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Creating folder: Project (e9a5e8481daf4571a59da04c7603b699). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1632.949467] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c39fb4ea-1bcf-401e-ba10-302fdf613ee0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1632.959616] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Created folder: Project (e9a5e8481daf4571a59da04c7603b699) in parent group-v692787. [ 1632.959616] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Creating folder: Instances. Parent ref: group-v692880. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1632.959616] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c3c42794-3c60-49a5-b9b1-038078ec00e0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1632.969016] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Created folder: Instances in parent group-v692880. [ 1632.969253] env[68571]: DEBUG oslo.service.loopingcall [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1632.969432] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1632.969624] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-a5d5e00c-5e88-4ec6-9eb8-97835a9b0928 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1632.987535] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1632.987535] env[68571]: value = "task-3467734" [ 1632.987535] env[68571]: _type = "Task" [ 1632.987535] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1632.996753] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467734, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1633.497016] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467734, 'name': CreateVM_Task, 'duration_secs': 0.277856} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1633.497159] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1633.497749] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1633.497925] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1633.498289] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1633.498551] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-63481391-c0dc-4027-9f4f-477af239d071 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1633.502841] env[68571]: DEBUG oslo_vmware.api [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Waiting for the task: (returnval){ [ 1633.502841] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52fed10f-77d1-f45b-2258-761d276e7646" [ 1633.502841] env[68571]: _type = "Task" [ 1633.502841] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1633.511636] env[68571]: DEBUG oslo_vmware.api [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52fed10f-77d1-f45b-2258-761d276e7646, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1634.013007] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1634.013363] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1634.013594] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1634.740506] env[68571]: DEBUG nova.compute.manager [req-00c7a639-88a9-4994-8881-45007562a9d4 req-d2effa2f-3757-47b8-8f2f-48dc1ed15e8e service nova] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Received event network-changed-792b3557-6600-4a11-9a32-d6b228281170 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1634.740730] env[68571]: DEBUG nova.compute.manager [req-00c7a639-88a9-4994-8881-45007562a9d4 req-d2effa2f-3757-47b8-8f2f-48dc1ed15e8e service nova] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Refreshing instance network info cache due to event network-changed-792b3557-6600-4a11-9a32-d6b228281170. {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1634.740938] env[68571]: DEBUG oslo_concurrency.lockutils [req-00c7a639-88a9-4994-8881-45007562a9d4 req-d2effa2f-3757-47b8-8f2f-48dc1ed15e8e service nova] Acquiring lock "refresh_cache-8506e00f-2b77-4fa1-804a-8e548b78ee7d" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1634.741171] env[68571]: DEBUG oslo_concurrency.lockutils [req-00c7a639-88a9-4994-8881-45007562a9d4 req-d2effa2f-3757-47b8-8f2f-48dc1ed15e8e service nova] Acquired lock "refresh_cache-8506e00f-2b77-4fa1-804a-8e548b78ee7d" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1634.741255] env[68571]: DEBUG nova.network.neutron [req-00c7a639-88a9-4994-8881-45007562a9d4 req-d2effa2f-3757-47b8-8f2f-48dc1ed15e8e service nova] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Refreshing network info cache for port 792b3557-6600-4a11-9a32-d6b228281170 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1634.981750] env[68571]: DEBUG nova.network.neutron [req-00c7a639-88a9-4994-8881-45007562a9d4 req-d2effa2f-3757-47b8-8f2f-48dc1ed15e8e service nova] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Updated VIF entry in instance network info cache for port 792b3557-6600-4a11-9a32-d6b228281170. 
{{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1634.982144] env[68571]: DEBUG nova.network.neutron [req-00c7a639-88a9-4994-8881-45007562a9d4 req-d2effa2f-3757-47b8-8f2f-48dc1ed15e8e service nova] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Updating instance_info_cache with network_info: [{"id": "792b3557-6600-4a11-9a32-d6b228281170", "address": "fa:16:3e:de:45:bc", "network": {"id": "f85d7976-c11f-4960-823b-6247b39b5626", "bridge": "br-int", "label": "tempest-ServersTestJSON-1166957619-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e9a5e8481daf4571a59da04c7603b699", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "7d377d75-3add-4a15-8691-74b2eb010924", "external-id": "nsx-vlan-transportzone-71", "segmentation_id": 71, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap792b3557-66", "ovs_interfaceid": "792b3557-6600-4a11-9a32-d6b228281170", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1634.990666] env[68571]: DEBUG oslo_concurrency.lockutils [req-00c7a639-88a9-4994-8881-45007562a9d4 req-d2effa2f-3757-47b8-8f2f-48dc1ed15e8e service nova] Releasing lock "refresh_cache-8506e00f-2b77-4fa1-804a-8e548b78ee7d" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1663.489441] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1664.489605] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1665.484906] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1665.488608] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1665.488808] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1665.500284] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring 
lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1665.500622] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1665.500693] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1665.500794] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1665.501887] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fd4fbf2-b310-4c3a-902c-c34b18270b9e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1665.510603] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e4fb5aeb-68bc-4d6e-8cc2-fd80de6bbcdc {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1665.524363] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f41437f0-685e-4740-ab0c-cd0b8558b143 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1665.530444] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4d56f84d-2544-4162-896b-a5bcc8605f7b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1665.560114] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180909MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1665.560270] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1665.560638] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1665.635815] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance e025f82d-a6a8-4dd4-b891-872f4b2fa176 
actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1665.635815] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1665.635815] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 56c7e368-4032-4028-83f0-58b0cd3b3cbd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1665.636025] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 47df3a07-1271-482c-bd3a-92fb9cef17bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1665.636025] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 73ba7761-3724-46ed-95c5-e93a6627a2d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1665.636134] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance d890a035-a14e-4be0-97c8-87edd9bb88e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1665.636201] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 9e8c8d14-144f-42e3-8556-796651b7b04f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1665.636317] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 1f8dd053-ebd8-4ad9-a607-ab364a3320ca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1665.636473] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1665.636603] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 8506e00f-2b77-4fa1-804a-8e548b78ee7d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1665.646922] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 7fd03349-420c-4076-959c-31562e95098d has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1665.657686] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 5deee3f1-70a0-4c0d-bda6-365235ca0d78 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1665.668161] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1665.668401] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1665.668557] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1665.825092] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-58ed5d6e-7f3a-4be1-96a0-82b1b20d3dc6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1665.833424] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e85109e4-2231-4751-99f4-832589489aa1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1665.862302] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ee43a9a-bcc8-43cc-b1ac-a424040493c6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1665.869075] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5be87286-fbe9-4a33-bac9-91f6ccbf07ee {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1665.881607] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1665.890280] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1665.903220] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1665.903404] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.343s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1668.904966] env[68571]: DEBUG oslo_service.periodic_task [None 
req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1669.490133] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1669.490335] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1669.490434] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1669.509713] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1669.509910] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1669.510016] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1669.510149] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1669.510272] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1669.510392] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1669.510511] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1669.510628] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Skipping network cache update for instance because it is Building. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1669.510765] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1669.510857] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1669.510968] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1672.489993] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1672.490316] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1673.490610] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1676.745845] env[68571]: WARNING oslo_vmware.rw_handles [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1676.745845] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1676.745845] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1676.745845] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1676.745845] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1676.745845] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 1676.745845] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1676.745845] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1676.745845] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1676.745845] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1676.745845] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1676.745845] env[68571]: ERROR oslo_vmware.rw_handles [ 1676.746648] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 
tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/0d6c4f65-ef6c-44b3-9270-09a51062e40e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1676.748151] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1676.748392] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Copying Virtual Disk [datastore1] vmware_temp/0d6c4f65-ef6c-44b3-9270-09a51062e40e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/0d6c4f65-ef6c-44b3-9270-09a51062e40e/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1676.748700] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-ba0e2462-893c-4528-84ef-68529a1e7df5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1676.758279] env[68571]: DEBUG oslo_vmware.api [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Waiting for the task: (returnval){ [ 1676.758279] env[68571]: value = "task-3467735" [ 1676.758279] env[68571]: _type = "Task" [ 1676.758279] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1676.765741] env[68571]: DEBUG oslo_vmware.api [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Task: {'id': task-3467735, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1677.268418] env[68571]: DEBUG oslo_vmware.exceptions [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Fault InvalidArgument not matched. 
{{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1677.268700] env[68571]: DEBUG oslo_concurrency.lockutils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1677.269201] env[68571]: ERROR nova.compute.manager [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1677.269201] env[68571]: Faults: ['InvalidArgument'] [ 1677.269201] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Traceback (most recent call last): [ 1677.269201] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1677.269201] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] yield resources [ 1677.269201] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1677.269201] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] self.driver.spawn(context, instance, image_meta, [ 1677.269201] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1677.269201] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1677.269201] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1677.269201] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] self._fetch_image_if_missing(context, vi) [ 1677.269201] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1677.269720] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] image_cache(vi, tmp_image_ds_loc) [ 1677.269720] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1677.269720] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] vm_util.copy_virtual_disk( [ 1677.269720] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1677.269720] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] session._wait_for_task(vmdk_copy_task) [ 1677.269720] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1677.269720] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] return self.wait_for_task(task_ref) [ 1677.269720] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1677.269720] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] return evt.wait() [ 1677.269720] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1677.269720] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] result = hub.switch() [ 1677.269720] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1677.269720] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] return self.greenlet.switch() [ 1677.270119] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1677.270119] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] self.f(*self.args, **self.kw) [ 1677.270119] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1677.270119] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] raise exceptions.translate_fault(task_info.error) [ 1677.270119] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1677.270119] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Faults: ['InvalidArgument'] [ 1677.270119] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] [ 1677.270119] env[68571]: INFO nova.compute.manager [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Terminating instance [ 1677.271025] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1677.271236] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1677.271469] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-a2eea12b-1d16-4097-9cd8-eb4e680159f2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1677.273694] env[68571]: DEBUG nova.compute.manager [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1677.273880] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1677.274577] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-805b55e3-23f6-4f9f-bac1-0dfb417280e8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1677.281105] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1677.281306] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-ac3de32f-c45f-4319-9bbb-25714c19021d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1677.283303] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1677.283474] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1677.284408] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-15d70bd4-edd3-48de-a9a3-260a15829562 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1677.290865] env[68571]: DEBUG oslo_vmware.api [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Waiting for the task: (returnval){ [ 1677.290865] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52ba1607-50fe-264a-8596-310dfd80c73a" [ 1677.290865] env[68571]: _type = "Task" [ 1677.290865] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1677.298460] env[68571]: DEBUG oslo_vmware.api [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52ba1607-50fe-264a-8596-310dfd80c73a, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1677.802099] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1677.802099] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Creating directory with path [datastore1] vmware_temp/95e15759-b7d7-450c-8f57-587073aa1efc/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1677.802099] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-3e473594-52d0-4d0b-9e88-086245afaa87 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1677.822060] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Created directory with path [datastore1] vmware_temp/95e15759-b7d7-450c-8f57-587073aa1efc/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1677.822245] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Fetch image to [datastore1] vmware_temp/95e15759-b7d7-450c-8f57-587073aa1efc/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1677.822413] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/95e15759-b7d7-450c-8f57-587073aa1efc/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1677.823134] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ef767ee-cd3b-4c01-82c7-7342b5641c63 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1677.829576] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba2a7d28-3bbd-432f-ad43-18e088f53a7b {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1677.838351] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-08602f28-0f9c-4ff9-9f37-5755a0e1710f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1677.869116] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81b4c322-ec50-4268-a6d1-28e88c7941c8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1677.874362] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8ccf3a87-6c11-4fc1-9e01-4b3314df2a10 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1677.895317] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1677.943329] env[68571]: DEBUG oslo_vmware.rw_handles [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/95e15759-b7d7-450c-8f57-587073aa1efc/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1678.014702] env[68571]: DEBUG oslo_vmware.rw_handles [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1678.014888] env[68571]: DEBUG oslo_vmware.rw_handles [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/95e15759-b7d7-450c-8f57-587073aa1efc/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1678.397591] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1678.397792] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1678.397970] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Deleting the datastore file [datastore1] e025f82d-a6a8-4dd4-b891-872f4b2fa176 {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1678.398256] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-693bf3d6-0668-4038-bdee-cf6c52a8427b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1678.405607] env[68571]: DEBUG oslo_vmware.api [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Waiting for the task: (returnval){ [ 1678.405607] env[68571]: value = "task-3467737" [ 1678.405607] env[68571]: _type = "Task" [ 1678.405607] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1678.412833] env[68571]: DEBUG oslo_vmware.api [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Task: {'id': task-3467737, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1678.915440] env[68571]: DEBUG oslo_vmware.api [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Task: {'id': task-3467737, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.066789} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1678.915802] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1678.915845] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1678.916061] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1678.916253] env[68571]: INFO nova.compute.manager [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Took 1.64 seconds to destroy the instance on the hypervisor. [ 1678.918476] env[68571]: DEBUG nova.compute.claims [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1678.918679] env[68571]: DEBUG oslo_concurrency.lockutils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1678.918900] env[68571]: DEBUG oslo_concurrency.lockutils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1679.100559] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bc1cc752-6a09-4bf8-8980-e9ae5eac0980 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1679.107933] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2cf690c-227a-4698-8bee-1a1f3d131287 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1679.137027] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0444bec-ba52-4e31-9105-47b53431dc15 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1679.143175] env[68571]: 
DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-572070a6-f1c7-4249-8901-068dd8416af0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1679.155428] env[68571]: DEBUG nova.compute.provider_tree [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1679.163734] env[68571]: DEBUG nova.scheduler.client.report [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1679.178539] env[68571]: DEBUG oslo_concurrency.lockutils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.260s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1679.179045] env[68571]: ERROR nova.compute.manager [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1679.179045] env[68571]: Faults: ['InvalidArgument'] [ 1679.179045] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Traceback (most recent call last): [ 1679.179045] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1679.179045] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] self.driver.spawn(context, instance, image_meta, [ 1679.179045] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1679.179045] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1679.179045] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1679.179045] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] self._fetch_image_if_missing(context, vi) [ 1679.179045] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1679.179045] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] image_cache(vi, tmp_image_ds_loc) [ 1679.179045] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1679.179459] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] vm_util.copy_virtual_disk( [ 1679.179459] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1679.179459] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] session._wait_for_task(vmdk_copy_task) [ 1679.179459] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1679.179459] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] return self.wait_for_task(task_ref) [ 1679.179459] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1679.179459] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] return evt.wait() [ 1679.179459] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1679.179459] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] result = hub.switch() [ 1679.179459] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1679.179459] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] return self.greenlet.switch() [ 1679.179459] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1679.179459] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] self.f(*self.args, **self.kw) [ 1679.179862] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1679.179862] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] raise exceptions.translate_fault(task_info.error) [ 1679.179862] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1679.179862] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Faults: ['InvalidArgument'] [ 1679.179862] env[68571]: ERROR nova.compute.manager [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] [ 1679.179862] env[68571]: DEBUG nova.compute.utils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: 
e025f82d-a6a8-4dd4-b891-872f4b2fa176] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1679.181386] env[68571]: DEBUG nova.compute.manager [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Build of instance e025f82d-a6a8-4dd4-b891-872f4b2fa176 was re-scheduled: A specified parameter was not correct: fileType [ 1679.181386] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1679.181754] env[68571]: DEBUG nova.compute.manager [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1679.181925] env[68571]: DEBUG nova.compute.manager [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1679.182107] env[68571]: DEBUG nova.compute.manager [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1679.182273] env[68571]: DEBUG nova.network.neutron [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1679.576256] env[68571]: DEBUG nova.network.neutron [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1679.593591] env[68571]: INFO nova.compute.manager [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Took 0.41 seconds to deallocate network for instance. 
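Both tracebacks above terminate in the same two steps: oslo.vmware's task poller sees the CopyVirtualDisk task enter an error state, and translate_fault turns the vCenter fault into a Python exception; because 'InvalidArgument' has no dedicated exception class ("Fault InvalidArgument not matched"), it surfaces as a generic VimFaultException carrying the fault name. A minimal standalone sketch of that poll-then-translate flow, assuming simplified dict-shaped task info — the registry, helper names, and fake task states below are illustrative, not the oslo.vmware internals:

```python
import time


class VimFaultException(Exception):
    """Generic fault carrying the vCenter fault names, as in the log."""

    def __init__(self, msg, fault_list):
        super().__init__(msg)
        self.fault_list = fault_list


# Registry of fault names that map to dedicated classes. 'InvalidArgument'
# is deliberately absent, so it falls through to the generic exception --
# that is what "Fault InvalidArgument not matched" records above.
_FAULT_CLASSES = {}


def translate_fault(error):
    cls = _FAULT_CLASSES.get(error['fault'])
    if cls is not None:
        return cls(error['msg'])
    return VimFaultException(error['msg'], [error['fault']])


def wait_for_task(poll, interval=0.5):
    """Poll a task-info callable until the task succeeds or errors."""
    while True:
        info = poll()
        if info['state'] == 'success':
            return info.get('result')
        if info['state'] == 'error':
            raise translate_fault(info['error'])
        time.sleep(interval)


# Simulate the failing CopyVirtualDisk_Task seen in the log.
states = iter([
    {'state': 'running'},
    {'state': 'error',
     'error': {'fault': 'InvalidArgument',
               'msg': 'A specified parameter was not correct: fileType'}},
])
try:
    wait_for_task(lambda: next(states), interval=0)
except VimFaultException as exc:
    # Prints: A specified parameter was not correct: fileType ['InvalidArgument']
    print(exc, exc.fault_list)
```

The compute manager then catches the exception, aborts the resource claim, and re-schedules the build, which is exactly the sequence of lock/claim lines that follows in the log.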
[ 1679.690372] env[68571]: INFO nova.scheduler.client.report [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Deleted allocations for instance e025f82d-a6a8-4dd4-b891-872f4b2fa176 [ 1679.711359] env[68571]: DEBUG oslo_concurrency.lockutils [None req-08340c64-38fd-448f-88eb-b5da9e5017ba tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Lock "e025f82d-a6a8-4dd4-b891-872f4b2fa176" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 618.294s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1679.712505] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d76d5332-f8d5-4301-b8ba-943d067f0ff8 tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Lock "e025f82d-a6a8-4dd4-b891-872f4b2fa176" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 421.438s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1679.712723] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d76d5332-f8d5-4301-b8ba-943d067f0ff8 tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Acquiring lock "e025f82d-a6a8-4dd4-b891-872f4b2fa176-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1679.712927] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d76d5332-f8d5-4301-b8ba-943d067f0ff8 tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Lock "e025f82d-a6a8-4dd4-b891-872f4b2fa176-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1679.713109] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d76d5332-f8d5-4301-b8ba-943d067f0ff8 tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Lock "e025f82d-a6a8-4dd4-b891-872f4b2fa176-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1679.714963] env[68571]: INFO nova.compute.manager [None req-d76d5332-f8d5-4301-b8ba-943d067f0ff8 tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Terminating instance [ 1679.716569] env[68571]: DEBUG nova.compute.manager [None req-d76d5332-f8d5-4301-b8ba-943d067f0ff8 tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Start destroying the instance on the hypervisor. 
{{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1679.716766] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-d76d5332-f8d5-4301-b8ba-943d067f0ff8 tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1679.717236] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e68b7154-9a58-478a-aa5d-2fa5187a4e84 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1679.722808] env[68571]: DEBUG nova.compute.manager [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1679.729033] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f31fe816-bdec-4b0a-a488-b6ab44533492 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1679.757264] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-d76d5332-f8d5-4301-b8ba-943d067f0ff8 tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance e025f82d-a6a8-4dd4-b891-872f4b2fa176 could not be found. [ 1679.757458] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-d76d5332-f8d5-4301-b8ba-943d067f0ff8 tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1679.757634] env[68571]: INFO nova.compute.manager [None req-d76d5332-f8d5-4301-b8ba-943d067f0ff8 tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1679.757917] env[68571]: DEBUG oslo.service.loopingcall [None req-d76d5332-f8d5-4301-b8ba-943d067f0ff8 tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1679.760158] env[68571]: DEBUG nova.compute.manager [-] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1679.760267] env[68571]: DEBUG nova.network.neutron [-] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1679.773171] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1679.773447] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1679.774914] env[68571]: INFO nova.compute.claims [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1679.785762] env[68571]: DEBUG nova.network.neutron [-] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1679.799072] env[68571]: INFO nova.compute.manager [-] [instance: e025f82d-a6a8-4dd4-b891-872f4b2fa176] Took 0.04 seconds to deallocate network for instance. 
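The "Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return" line above is oslo.service's looping-call machinery: the deallocation body is re-invoked on a timer until it signals completion by raising LoopingCallDone. A minimal sketch using FixedIntervalLoopingCall, the simplest variant (the retry body and attempt counter are stand-ins for Nova's actual retry logic, and this assumes oslo.service is installed):

```python
from oslo_service import loopingcall

attempts = {'n': 0}


def _deallocate_with_retries():
    # Stand-in body: pretend the first two calls hit a transient failure
    # and the third one succeeds.
    attempts['n'] += 1
    if attempts['n'] >= 3:
        # Raising LoopingCallDone stops the loop and hands .wait() a value.
        raise loopingcall.LoopingCallDone(retvalue='deallocated')


timer = loopingcall.FixedIntervalLoopingCall(_deallocate_with_retries)
result = timer.start(interval=0.01).wait()
print(attempts['n'], result)  # 3 deallocated
```

The production path polls with a back-off variant of the same primitive, but the stop condition is identical: the wrapped function raises LoopingCallDone when there is nothing left to retry.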
[ 1679.886218] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d76d5332-f8d5-4301-b8ba-943d067f0ff8 tempest-ListServersNegativeTestJSON-857590735 tempest-ListServersNegativeTestJSON-857590735-project-member] Lock "e025f82d-a6a8-4dd4-b891-872f4b2fa176" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.174s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1679.956224] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f4a22451-4e34-4200-b2b8-00d741ca44c4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1679.963353] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d5e20fb6-1c05-4e7e-b562-f8afd7bdc286 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1679.993359] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1abad28-d071-472b-9328-b24549fbbbc8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1680.000148] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9b3b28f6-1819-4209-b48c-710c32d7bdbe {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1680.012693] env[68571]: DEBUG nova.compute.provider_tree [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1680.024533] env[68571]: DEBUG nova.scheduler.client.report [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1680.037844] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.264s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1680.038363] env[68571]: DEBUG nova.compute.manager [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Start building networks asynchronously for instance. 
{{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1680.068476] env[68571]: DEBUG nova.compute.utils [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1680.069799] env[68571]: DEBUG nova.compute.manager [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1680.069973] env[68571]: DEBUG nova.network.neutron [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1680.077542] env[68571]: DEBUG nova.compute.manager [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1680.140129] env[68571]: DEBUG nova.policy [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9568742d9c6340efbde4b46080df74ac', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5d5917d06b21471c88e76a103cc66f32', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 1680.151899] env[68571]: DEBUG nova.compute.manager [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1680.177889] env[68571]: DEBUG nova.virt.hardware [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1680.178141] env[68571]: DEBUG nova.virt.hardware [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1680.178302] env[68571]: DEBUG nova.virt.hardware [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1680.178482] env[68571]: DEBUG nova.virt.hardware [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1680.178646] env[68571]: DEBUG nova.virt.hardware [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1680.178799] env[68571]: DEBUG nova.virt.hardware [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1680.179017] env[68571]: DEBUG nova.virt.hardware [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1680.179186] env[68571]: DEBUG nova.virt.hardware [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1680.179351] env[68571]: DEBUG nova.virt.hardware [None 
req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1680.179546] env[68571]: DEBUG nova.virt.hardware [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1680.179814] env[68571]: DEBUG nova.virt.hardware [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1680.180922] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b576ded1-50dd-48b7-b410-bed2e7233fa9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1680.189357] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-37b45b2c-81aa-40d5-b909-754104dd3c4b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1680.484145] env[68571]: DEBUG nova.network.neutron [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Successfully created port: f4487418-d697-47dc-b724-da86c28b3274 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1681.044251] env[68571]: DEBUG nova.network.neutron [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Successfully updated port: f4487418-d697-47dc-b724-da86c28b3274 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1681.053872] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Acquiring lock "refresh_cache-7fd03349-420c-4076-959c-31562e95098d" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1681.054252] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Acquired lock "refresh_cache-7fd03349-420c-4076-959c-31562e95098d" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1681.054252] env[68571]: DEBUG nova.network.neutron [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1681.092085] env[68571]: DEBUG nova.network.neutron [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 
tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1681.241839] env[68571]: DEBUG nova.network.neutron [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Updating instance_info_cache with network_info: [{"id": "f4487418-d697-47dc-b724-da86c28b3274", "address": "fa:16:3e:06:7d:af", "network": {"id": "8ff5194d-e331-4062-8c8e-1f354f0d85c6", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1458728586-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5d5917d06b21471c88e76a103cc66f32", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4223acd2-30f7-440e-b975-60b30d931694", "external-id": "nsx-vlan-transportzone-647", "segmentation_id": 647, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf4487418-d6", "ovs_interfaceid": "f4487418-d697-47dc-b724-da86c28b3274", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1681.255540] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Releasing lock "refresh_cache-7fd03349-420c-4076-959c-31562e95098d" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1681.255832] env[68571]: DEBUG nova.compute.manager [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Instance network_info: |[{"id": "f4487418-d697-47dc-b724-da86c28b3274", "address": "fa:16:3e:06:7d:af", "network": {"id": "8ff5194d-e331-4062-8c8e-1f354f0d85c6", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1458728586-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5d5917d06b21471c88e76a103cc66f32", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4223acd2-30f7-440e-b975-60b30d931694", "external-id": "nsx-vlan-transportzone-647", "segmentation_id": 647, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf4487418-d6", "ovs_interfaceid": "f4487418-d697-47dc-b724-da86c28b3274", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": 
{}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1681.256245] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:06:7d:af', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '4223acd2-30f7-440e-b975-60b30d931694', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'f4487418-d697-47dc-b724-da86c28b3274', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1681.263601] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Creating folder: Project (5d5917d06b21471c88e76a103cc66f32). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1681.264117] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0b57111b-208d-47de-975b-991dc8accce9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1681.275780] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Created folder: Project (5d5917d06b21471c88e76a103cc66f32) in parent group-v692787. [ 1681.275950] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Creating folder: Instances. Parent ref: group-v692883. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1681.276178] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-57eeefab-aebb-4015-b3b1-6e01b97fcc37 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1681.284886] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Created folder: Instances in parent group-v692883. [ 1681.285124] env[68571]: DEBUG oslo.service.loopingcall [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1681.285301] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7fd03349-420c-4076-959c-31562e95098d] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1681.285487] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-e65f57d4-1911-4443-82e6-7b6bf2be327a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1681.303759] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1681.303759] env[68571]: value = "task-3467740" [ 1681.303759] env[68571]: _type = "Task" [ 1681.303759] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1681.310704] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467740, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1681.616317] env[68571]: DEBUG nova.compute.manager [req-ecfcbe39-9b13-4275-89e6-1c0fbf41daf3 req-49de011a-2148-4a75-9fb5-0932412931ac service nova] [instance: 7fd03349-420c-4076-959c-31562e95098d] Received event network-vif-plugged-f4487418-d697-47dc-b724-da86c28b3274 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1681.616549] env[68571]: DEBUG oslo_concurrency.lockutils [req-ecfcbe39-9b13-4275-89e6-1c0fbf41daf3 req-49de011a-2148-4a75-9fb5-0932412931ac service nova] Acquiring lock "7fd03349-420c-4076-959c-31562e95098d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1681.616790] env[68571]: DEBUG oslo_concurrency.lockutils [req-ecfcbe39-9b13-4275-89e6-1c0fbf41daf3 req-49de011a-2148-4a75-9fb5-0932412931ac service nova] Lock "7fd03349-420c-4076-959c-31562e95098d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1681.616935] env[68571]: DEBUG oslo_concurrency.lockutils [req-ecfcbe39-9b13-4275-89e6-1c0fbf41daf3 req-49de011a-2148-4a75-9fb5-0932412931ac service nova] Lock "7fd03349-420c-4076-959c-31562e95098d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1681.617110] env[68571]: DEBUG nova.compute.manager [req-ecfcbe39-9b13-4275-89e6-1c0fbf41daf3 req-49de011a-2148-4a75-9fb5-0932412931ac service nova] [instance: 7fd03349-420c-4076-959c-31562e95098d] No waiting events found dispatching network-vif-plugged-f4487418-d697-47dc-b724-da86c28b3274 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1681.617275] env[68571]: WARNING nova.compute.manager [req-ecfcbe39-9b13-4275-89e6-1c0fbf41daf3 req-49de011a-2148-4a75-9fb5-0932412931ac service nova] [instance: 7fd03349-420c-4076-959c-31562e95098d] Received unexpected event network-vif-plugged-f4487418-d697-47dc-b724-da86c28b3274 for instance with vm_state building and task_state spawning.
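The records above show the poll-until-complete pattern these `wait_for_task` / `_poll_task` trailers reflect: the driver submits `CreateVM_Task`, then repeatedly fetches the task state until it reports success or a fault. A minimal sketch of that loop, assuming a hypothetical `get_task_info()` helper in place of the real oslo.vmware PropertyCollector read:

```python
import time

def wait_for_task(get_task_info, task_ref, interval=0.5):
    """Poll a vSphere task until it finishes (illustrative sketch only).

    get_task_info is a hypothetical stand-in for the per-poll task read
    that oslo.vmware performs; assume it returns a dict like
    {'state': 'running', 'progress': 0}.
    """
    while True:
        info = get_task_info(task_ref)
        if info['state'] == 'success':
            return info.get('result')
        if info['state'] == 'error':
            # Corresponds to the poller raising the translated fault,
            # e.g. the InvalidArgument fault seen later in this log.
            raise RuntimeError(info.get('error', 'task failed'))
        time.sleep(interval)  # the log shows ~0.5s between progress polls
```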
[ 1681.617437] env[68571]: DEBUG nova.compute.manager [req-ecfcbe39-9b13-4275-89e6-1c0fbf41daf3 req-49de011a-2148-4a75-9fb5-0932412931ac service nova] [instance: 7fd03349-420c-4076-959c-31562e95098d] Received event network-changed-f4487418-d697-47dc-b724-da86c28b3274 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1681.617592] env[68571]: DEBUG nova.compute.manager [req-ecfcbe39-9b13-4275-89e6-1c0fbf41daf3 req-49de011a-2148-4a75-9fb5-0932412931ac service nova] [instance: 7fd03349-420c-4076-959c-31562e95098d] Refreshing instance network info cache due to event network-changed-f4487418-d697-47dc-b724-da86c28b3274. {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1681.617775] env[68571]: DEBUG oslo_concurrency.lockutils [req-ecfcbe39-9b13-4275-89e6-1c0fbf41daf3 req-49de011a-2148-4a75-9fb5-0932412931ac service nova] Acquiring lock "refresh_cache-7fd03349-420c-4076-959c-31562e95098d" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1681.617910] env[68571]: DEBUG oslo_concurrency.lockutils [req-ecfcbe39-9b13-4275-89e6-1c0fbf41daf3 req-49de011a-2148-4a75-9fb5-0932412931ac service nova] Acquired lock "refresh_cache-7fd03349-420c-4076-959c-31562e95098d" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1681.618082] env[68571]: DEBUG nova.network.neutron [req-ecfcbe39-9b13-4275-89e6-1c0fbf41daf3 req-49de011a-2148-4a75-9fb5-0932412931ac service nova] [instance: 7fd03349-420c-4076-959c-31562e95098d] Refreshing network info cache for port f4487418-d697-47dc-b724-da86c28b3274 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1681.813025] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467740, 'name': CreateVM_Task, 'duration_secs': 0.28026} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1681.813222] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 7fd03349-420c-4076-959c-31562e95098d] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1681.813860] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1681.814034] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1681.814353] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1681.814599] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-1722eaf8-c7bf-436f-9d90-9a4dae11aa4c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1681.820144] env[68571]: DEBUG oslo_vmware.api [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Waiting for the task: (returnval){ [ 1681.820144] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]527e2484-d946-1bd4-356d-a4f57e12cf26" [ 1681.820144] env[68571]: _type = "Task" [ 1681.820144] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1681.827593] env[68571]: DEBUG oslo_vmware.api [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]527e2484-d946-1bd4-356d-a4f57e12cf26, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1681.865256] env[68571]: DEBUG nova.network.neutron [req-ecfcbe39-9b13-4275-89e6-1c0fbf41daf3 req-49de011a-2148-4a75-9fb5-0932412931ac service nova] [instance: 7fd03349-420c-4076-959c-31562e95098d] Updated VIF entry in instance network info cache for port f4487418-d697-47dc-b724-da86c28b3274. 
{{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1681.865624] env[68571]: DEBUG nova.network.neutron [req-ecfcbe39-9b13-4275-89e6-1c0fbf41daf3 req-49de011a-2148-4a75-9fb5-0932412931ac service nova] [instance: 7fd03349-420c-4076-959c-31562e95098d] Updating instance_info_cache with network_info: [{"id": "f4487418-d697-47dc-b724-da86c28b3274", "address": "fa:16:3e:06:7d:af", "network": {"id": "8ff5194d-e331-4062-8c8e-1f354f0d85c6", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-1458728586-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "5d5917d06b21471c88e76a103cc66f32", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "4223acd2-30f7-440e-b975-60b30d931694", "external-id": "nsx-vlan-transportzone-647", "segmentation_id": 647, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapf4487418-d6", "ovs_interfaceid": "f4487418-d697-47dc-b724-da86c28b3274", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1681.875554] env[68571]: DEBUG oslo_concurrency.lockutils [req-ecfcbe39-9b13-4275-89e6-1c0fbf41daf3 req-49de011a-2148-4a75-9fb5-0932412931ac service nova] Releasing lock "refresh_cache-7fd03349-420c-4076-959c-31562e95098d" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1682.331106] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1682.331106] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1682.331106] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1700.496111] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3cf20cfc-35c6-4896-bfb7-a5471fa1665a tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquiring lock "4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68571) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1721.450923] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._sync_power_states {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1721.470721] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Getting list of instances from cluster (obj){ [ 1721.470721] env[68571]: value = "domain-c8" [ 1721.470721] env[68571]: _type = "ClusterComputeResource" [ 1721.470721] env[68571]: } {{(pid=68571) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 1721.471938] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-31feae6c-ed85-44fd-b899-0ea7c3b0bc04 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1721.492628] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Got total of 10 instances {{(pid=68571) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 1721.492791] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Triggering sync for uuid a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac {{(pid=68571) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1721.492985] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Triggering sync for uuid 56c7e368-4032-4028-83f0-58b0cd3b3cbd {{(pid=68571) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1721.493522] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Triggering sync for uuid 47df3a07-1271-482c-bd3a-92fb9cef17bd {{(pid=68571) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1721.493756] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Triggering sync for uuid 73ba7761-3724-46ed-95c5-e93a6627a2d3 {{(pid=68571) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1721.493928] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Triggering sync for uuid d890a035-a14e-4be0-97c8-87edd9bb88e4 {{(pid=68571) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1721.494098] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Triggering sync for uuid 9e8c8d14-144f-42e3-8556-796651b7b04f {{(pid=68571) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1721.494257] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Triggering sync for uuid 1f8dd053-ebd8-4ad9-a607-ab364a3320ca {{(pid=68571) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1721.494409] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Triggering sync for uuid 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2 {{(pid=68571) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1721.494558] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Triggering sync for uuid 8506e00f-2b77-4fa1-804a-8e548b78ee7d {{(pid=68571) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 
1721.494707] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Triggering sync for uuid 7fd03349-420c-4076-959c-31562e95098d {{(pid=68571) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 1721.495027] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1721.495265] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "56c7e368-4032-4028-83f0-58b0cd3b3cbd" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1721.495467] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "47df3a07-1271-482c-bd3a-92fb9cef17bd" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1721.495661] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "73ba7761-3724-46ed-95c5-e93a6627a2d3" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1721.495854] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "d890a035-a14e-4be0-97c8-87edd9bb88e4" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1721.496059] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "9e8c8d14-144f-42e3-8556-796651b7b04f" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1721.496256] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "1f8dd053-ebd8-4ad9-a607-ab364a3320ca" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1721.496449] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1721.496639] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "8506e00f-2b77-4fa1-804a-8e548b78ee7d" by
"nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1721.496843] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "7fd03349-420c-4076-959c-31562e95098d" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1723.515751] env[68571]: WARNING oslo_vmware.rw_handles [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1723.515751] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1723.515751] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1723.515751] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1723.515751] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1723.515751] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 1723.515751] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1723.515751] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1723.515751] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1723.515751] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1723.515751] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1723.515751] env[68571]: ERROR oslo_vmware.rw_handles [ 1723.516454] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/95e15759-b7d7-450c-8f57-587073aa1efc/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1723.518347] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1723.518606] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Copying Virtual Disk [datastore1] vmware_temp/95e15759-b7d7-450c-8f57-587073aa1efc/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/95e15759-b7d7-450c-8f57-587073aa1efc/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk
/opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1723.518906] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-8a584276-1265-46ba-b0be-e3f67a559299 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1723.526336] env[68571]: DEBUG oslo_vmware.api [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Waiting for the task: (returnval){ [ 1723.526336] env[68571]: value = "task-3467741" [ 1723.526336] env[68571]: _type = "Task" [ 1723.526336] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1723.533774] env[68571]: DEBUG oslo_vmware.api [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Task: {'id': task-3467741, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1723.535205] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1724.036382] env[68571]: DEBUG oslo_vmware.exceptions [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Fault InvalidArgument not matched. {{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1724.036656] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1724.037229] env[68571]: ERROR nova.compute.manager [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1724.037229] env[68571]: Faults: ['InvalidArgument'] [ 1724.037229] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Traceback (most recent call last): [ 1724.037229] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1724.037229] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] yield resources [ 1724.037229] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1724.037229] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] self.driver.spawn(context, instance, image_meta, [ 1724.037229] env[68571]: ERROR nova.compute.manager [instance: 
a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1724.037229] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1724.037229] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1724.037229] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] self._fetch_image_if_missing(context, vi) [ 1724.037229] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1724.037857] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] image_cache(vi, tmp_image_ds_loc) [ 1724.037857] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1724.037857] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] vm_util.copy_virtual_disk( [ 1724.037857] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1724.037857] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] session._wait_for_task(vmdk_copy_task) [ 1724.037857] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1724.037857] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] return self.wait_for_task(task_ref) [ 1724.037857] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1724.037857] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] return evt.wait() [ 1724.037857] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1724.037857] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] result = hub.switch() [ 1724.037857] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1724.037857] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] return self.greenlet.switch() [ 1724.038447] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1724.038447] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] self.f(*self.args, **self.kw) [ 1724.038447] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1724.038447] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] raise 
exceptions.translate_fault(task_info.error) [ 1724.038447] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1724.038447] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Faults: ['InvalidArgument'] [ 1724.038447] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] [ 1724.038447] env[68571]: INFO nova.compute.manager [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Terminating instance [ 1724.039092] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1724.039302] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1724.039538] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7bb453c0-537b-4933-894e-d4b75919b351 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.041706] env[68571]: DEBUG nova.compute.manager [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Start destroying the instance on the hypervisor. 
{{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1724.041893] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1724.042609] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88be46ee-0ffd-453f-8709-c77a5311ca33 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.049468] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1724.049771] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-688c4c81-1c01-4c79-910e-257cd231069a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.052013] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1724.052180] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1724.053127] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-ab746c82-ce2e-4468-a429-488285fe0b84 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.057707] env[68571]: DEBUG oslo_vmware.api [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Waiting for the task: (returnval){ [ 1724.057707] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]523617fe-17ca-9250-0476-aa41a72a7e35" [ 1724.057707] env[68571]: _type = "Task" [ 1724.057707] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1724.064901] env[68571]: DEBUG oslo_vmware.api [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]523617fe-17ca-9250-0476-aa41a72a7e35, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1724.490058] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1724.568923] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1724.569416] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Creating directory with path [datastore1] vmware_temp/2731cd94-874b-4f6b-ac51-49af2558f9a7/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1724.569535] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-7018dcd8-e1f1-477f-90a3-5989a8a4c3a3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.592320] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Created directory with path [datastore1] vmware_temp/2731cd94-874b-4f6b-ac51-49af2558f9a7/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1724.592320] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Fetch image to [datastore1] vmware_temp/2731cd94-874b-4f6b-ac51-49af2558f9a7/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1724.592320] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/2731cd94-874b-4f6b-ac51-49af2558f9a7/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1724.593088] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f093732b-82ba-420a-8678-a87823239dac {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.600876] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f32a230-d242-4fc7-9009-469f839c7186 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.612531] env[68571]: DEBUG oslo_vmware.service 
[-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dfd974a4-40e9-4390-b68e-b738f31436c9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.641750] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ad79149c-083e-4c8e-b0b0-e6f940694a34 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.646875] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-73c4c776-3944-4ea1-b744-226e31ca3600 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1724.665745] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1724.712832] env[68571]: DEBUG oslo_vmware.rw_handles [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2731cd94-874b-4f6b-ac51-49af2558f9a7/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1724.771581] env[68571]: DEBUG oslo_vmware.rw_handles [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1724.771581] env[68571]: DEBUG oslo_vmware.rw_handles [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2731cd94-874b-4f6b-ac51-49af2558f9a7/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1725.237883] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1725.238354] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1725.238471] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Deleting the datastore file [datastore1] a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1725.238803] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-078534d3-11e8-4f4b-868d-ff5eac813940 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1725.246078] env[68571]: DEBUG oslo_vmware.api [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Waiting for the task: (returnval){ [ 1725.246078] env[68571]: value = "task-3467743" [ 1725.246078] env[68571]: _type = "Task" [ 1725.246078] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1725.256371] env[68571]: DEBUG oslo_vmware.api [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Task: {'id': task-3467743, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1725.489400] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1725.501354] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1725.501571] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1725.501957] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1725.501957] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1725.503006] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-def24203-6c3f-43d4-9e05-75f6a972b270 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1725.511797] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e8d5776-ec9d-45b8-9fa0-03c2c20b89e9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1725.525538] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b93a4da0-8758-4d49-8574-f42c41ebe635 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1725.531894] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a7ee1a55-21c1-4613-ab02-c4eb6ed5355c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1725.561299] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180937MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1725.561444] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1725.561643] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1725.709429] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1725.709731] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 56c7e368-4032-4028-83f0-58b0cd3b3cbd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1725.709731] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 47df3a07-1271-482c-bd3a-92fb9cef17bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1725.709855] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 73ba7761-3724-46ed-95c5-e93a6627a2d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1725.709958] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance d890a035-a14e-4be0-97c8-87edd9bb88e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1725.710093] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 9e8c8d14-144f-42e3-8556-796651b7b04f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1725.710214] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 1f8dd053-ebd8-4ad9-a607-ab364a3320ca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1725.710330] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1725.710443] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 8506e00f-2b77-4fa1-804a-8e548b78ee7d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1725.710555] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 7fd03349-420c-4076-959c-31562e95098d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1725.721604] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 5deee3f1-70a0-4c0d-bda6-365235ca0d78 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1725.731802] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1725.732037] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1725.732229] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1725.747207] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Refreshing inventories for resource provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 1725.757543] env[68571]: DEBUG oslo_vmware.api [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Task: {'id': task-3467743, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075093} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1725.757776] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1725.757957] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1725.758143] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1725.758317] env[68571]: INFO nova.compute.manager [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Took 1.72 seconds to destroy the instance on the hypervisor. 
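The arithmetic behind the "Final resource view" record above can be reproduced directly from the per-instance allocations logged earlier. A minimal sketch in plain Python (illustrative only, not Nova's resource tracker code), assuming used_ram folds in the 512 MB of reserved host memory that the MEMORY_MB inventory below reports:

# Plain-Python check of the "Final resource view" numbers (not Nova code).
# Ten instances are reported as actively managed above, each holding
# {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}; the MEMORY_MB inventory
# reports 'reserved': 512, which the tracker counts toward used host RAM.
allocations = [{'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}] * 10
reserved_host_memory_mb = 512

used_vcpus = sum(a['VCPU'] for a in allocations)
used_disk_gb = sum(a['DISK_GB'] for a in allocations)
used_ram_mb = reserved_host_memory_mb + sum(a['MEMORY_MB'] for a in allocations)

assert (used_vcpus, used_ram_mb, used_disk_gb) == (10, 1792, 10)  # matches the log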
[ 1725.760318] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Updating ProviderTree inventory for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 1725.760495] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Updating inventory in ProviderTree for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 1725.762405] env[68571]: DEBUG nova.compute.claims [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1725.762614] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1725.770943] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Refreshing aggregate associations for resource provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd, aggregates: None {{(pid=68571) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 1725.786666] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Refreshing trait associations for resource provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE {{(pid=68571) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 1725.910437] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ed0b2568-0ada-408e-a638-5281a384171e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1725.917680] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a651138-49c8-4396-846d-217c2a0ca145 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1725.947067] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b22db93-05e6-44a2-b2f8-d87daf6d6c3e 
{{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1725.954461] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64279520-e6e2-4e0b-89d7-f1bc1e9ca85d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1725.968105] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1725.976359] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1725.990193] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1725.990369] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.429s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1725.990658] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.228s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1726.171145] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7bfd42bd-5dea-4345-bc5f-6877cf5cdd69 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1726.177150] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8e5bdf09-0c27-42a7-a9d9-338fdc00db43 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1726.208645] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9a9a8225-8a8a-4e6d-89af-61613fa3855f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1726.216218] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-024413a2-cfd3-4f1c-b649-afb9cdba8317 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1726.229936] 
env[68571]: DEBUG nova.compute.provider_tree [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}}
[ 1726.238311] env[68571]: DEBUG nova.scheduler.client.report [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
[ 1726.260614] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.270s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}
[ 1726.261346] env[68571]: ERROR nova.compute.manager [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1726.261346] env[68571]: Faults: ['InvalidArgument']
[ 1726.261346] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Traceback (most recent call last):
[ 1726.261346] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac]   File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance
[ 1726.261346] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac]     self.driver.spawn(context, instance, image_meta,
[ 1726.261346] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac]   File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn
[ 1726.261346] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac]     self._vmops.spawn(context, instance, image_meta, injected_files,
[ 1726.261346] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn
[ 1726.261346] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac]     self._fetch_image_if_missing(context, vi)
[ 1726.261346] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing
[ 1726.261346] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac]     image_cache(vi, tmp_image_ds_loc)
[ 1726.261346] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac]   File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image
[ 1726.261945] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac]     vm_util.copy_virtual_disk(
[ 1726.261945] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac]   File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk
[ 1726.261945] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac]     session._wait_for_task(vmdk_copy_task)
[ 1726.261945] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac]   File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task
[ 1726.261945] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac]     return self.wait_for_task(task_ref)
[ 1726.261945] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task
[ 1726.261945] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac]     return evt.wait()
[ 1726.261945] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait
[ 1726.261945] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac]     result = hub.switch()
[ 1726.261945] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac]   File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch
[ 1726.261945] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac]     return self.greenlet.switch()
[ 1726.261945] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner
[ 1726.261945] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac]     self.f(*self.args, **self.kw)
[ 1726.262797] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac]   File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task
[ 1726.262797] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac]     raise exceptions.translate_fault(task_info.error)
[ 1726.262797] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType
[ 1726.262797] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Faults: ['InvalidArgument']
[ 1726.262797] env[68571]: ERROR nova.compute.manager [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac]
[ 1726.262797] env[68571]: DEBUG nova.compute.utils [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}}
[ 1726.263735] env[68571]: DEBUG nova.compute.manager [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Build of instance a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac was re-scheduled: A specified parameter was not correct: fileType
[ 1726.263735] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}}
[ 1726.264132] env[68571]: DEBUG nova.compute.manager [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}}
[ 1726.264310] env[68571]: DEBUG nova.compute.manager [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}}
[ 1726.264487] env[68571]: DEBUG nova.compute.manager [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}}
[ 1726.264733] env[68571]: DEBUG nova.network.neutron [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}}
[ 1726.610316] env[68571]: DEBUG nova.network.neutron [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}}
[ 1726.625517] env[68571]: INFO nova.compute.manager [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Took 0.36 seconds to deallocate network for instance.
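The traceback above bottoms out in oslo.vmware's task poller: wait_for_task() blocks on a greenthread event while a looping call polls the vCenter task, and _poll_task() translates a task-level error such as InvalidArgument into a VimFaultException instead of returning. A minimal self-contained sketch of that polling pattern follows; the get_task_info callable and the local TaskFault class are stand-ins for illustration, not oslo.vmware's actual API:

import time

class TaskFault(Exception):
    """Stand-in for oslo_vmware.exceptions.VimFaultException."""
    def __init__(self, msg, fault_list):
        super().__init__(msg)
        self.fault_list = fault_list  # e.g. ['InvalidArgument']

def wait_for_task(get_task_info, interval=0.5):
    """Poll until the vSphere task finishes; raise on a task-level fault."""
    while True:
        info = get_task_info()  # stand-in for one property-collector round-trip
        if info['state'] == 'success':
            return info.get('result')
        if info['state'] == 'error':
            # oslo.vmware's translate_fault() performs this mapping for real
            raise TaskFault(info['error_message'], info['faults'])
        time.sleep(interval)  # task still 'queued' or 'running'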
[ 1726.735154] env[68571]: INFO nova.scheduler.client.report [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Deleted allocations for instance a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac [ 1726.761338] env[68571]: DEBUG oslo_concurrency.lockutils [None req-5cea2df6-5f76-4a6e-8ff7-156b5a0a11a7 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 615.625s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1726.761338] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b115017e-4635-4cad-a5ed-12b223dcd745 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 420.325s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1726.761704] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b115017e-4635-4cad-a5ed-12b223dcd745 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquiring lock "a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1726.761769] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b115017e-4635-4cad-a5ed-12b223dcd745 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1726.761878] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b115017e-4635-4cad-a5ed-12b223dcd745 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1726.765687] env[68571]: INFO nova.compute.manager [None req-b115017e-4635-4cad-a5ed-12b223dcd745 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Terminating instance [ 1726.767414] env[68571]: DEBUG nova.compute.manager [None req-b115017e-4635-4cad-a5ed-12b223dcd745 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Start destroying the instance on the hypervisor. 
{{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1726.767604] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-b115017e-4635-4cad-a5ed-12b223dcd745 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1726.767865] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5baa2d58-95f8-4259-b0b8-bfad605ee172 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1726.777618] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-64ff0993-16d9-49d6-8024-97ce8dbf8cda {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1726.788435] env[68571]: DEBUG nova.compute.manager [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1726.808969] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-b115017e-4635-4cad-a5ed-12b223dcd745 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac could not be found. [ 1726.809306] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-b115017e-4635-4cad-a5ed-12b223dcd745 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1726.809489] env[68571]: INFO nova.compute.manager [None req-b115017e-4635-4cad-a5ed-12b223dcd745 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1726.809724] env[68571]: DEBUG oslo.service.loopingcall [None req-b115017e-4635-4cad-a5ed-12b223dcd745 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1726.809946] env[68571]: DEBUG nova.compute.manager [-] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1726.810364] env[68571]: DEBUG nova.network.neutron [-] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1726.835277] env[68571]: DEBUG oslo_concurrency.lockutils [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1726.835546] env[68571]: DEBUG oslo_concurrency.lockutils [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1726.837478] env[68571]: INFO nova.compute.claims [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1726.840179] env[68571]: DEBUG nova.network.neutron [-] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1726.847731] env[68571]: INFO nova.compute.manager [-] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] Took 0.04 seconds to deallocate network for instance. [ 1726.945250] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b115017e-4635-4cad-a5ed-12b223dcd745 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.184s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1726.946089] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 5.451s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1726.946284] env[68571]: INFO nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac] During sync_power_state the instance has a pending task (deleting). Skip. 
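The acquired/released pairs above ("waited N s" / "held N s") are oslo.concurrency's named in-process locks; the 420-second wait on the instance lock shows the terminate request queuing behind the still-running build. A minimal sketch of standard lockutils usage that produces such records (a generic example, not Nova's exact decorators; requires oslo.concurrency):

from oslo_concurrency import lockutils

# Decorator form: every call serializes on the named in-process lock.
# lockutils logs 'acquired ... waited X' on entry and 'released ... held Y' on exit.
@lockutils.synchronized('compute_resources')
def claim_resources():
    ...  # critical section: resource-tracker-style bookkeeping

# Context-manager form for ad-hoc critical sections, e.g. a per-instance
# event lock like "a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac-events".
def clear_events(instance_uuid):
    with lockutils.lock(f'{instance_uuid}-events'):
        ...  # mutate the event table only while holding the lock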
[ 1726.946457] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "a9eda0ce-743a-48f4-ad9d-0f2ad2eca9ac" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1727.016119] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec1cab4f-7d4f-44e0-a25f-e9724796aec3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1727.024218] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-88cf1c35-e452-4b51-8c57-2e9abff32134 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1727.054368] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f0028a2c-1b2e-40f0-9755-7d2926640ea7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1727.061488] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-497d2973-3052-4129-8bd3-beac7c0c8b6b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1727.074255] env[68571]: DEBUG nova.compute.provider_tree [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1727.083221] env[68571]: DEBUG nova.scheduler.client.report [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1727.100285] env[68571]: DEBUG oslo_concurrency.lockutils [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.265s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1727.100756] env[68571]: DEBUG nova.compute.manager [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Start building networks asynchronously for instance. 
{{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1727.130997] env[68571]: DEBUG nova.compute.utils [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1727.132517] env[68571]: DEBUG nova.compute.manager [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1727.132696] env[68571]: DEBUG nova.network.neutron [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1727.142932] env[68571]: DEBUG nova.compute.manager [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1727.190302] env[68571]: DEBUG nova.policy [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '2748e40bf2f64ac6a0816dd0e9767a3b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'e7abc16f11794107bc45dfbe74a5a7ca', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 1727.206752] env[68571]: DEBUG nova.compute.manager [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1727.231655] env[68571]: DEBUG nova.virt.hardware [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1727.231938] env[68571]: DEBUG nova.virt.hardware [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1727.232226] env[68571]: DEBUG nova.virt.hardware [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1727.232475] env[68571]: DEBUG nova.virt.hardware [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1727.232654] env[68571]: DEBUG nova.virt.hardware [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1727.232839] env[68571]: DEBUG nova.virt.hardware [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1727.233105] env[68571]: DEBUG nova.virt.hardware [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1727.233304] env[68571]: DEBUG nova.virt.hardware [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1727.233512] env[68571]: DEBUG nova.virt.hardware [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 
tempest-ServersTestJSON-1811012872-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1727.233719] env[68571]: DEBUG nova.virt.hardware [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1727.233936] env[68571]: DEBUG nova.virt.hardware [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1727.234962] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-62575ab3-0e05-460e-a6b3-b342b7521f49 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1727.243799] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24bd582c-5117-4527-9635-4709e9bccb7f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1727.485493] env[68571]: DEBUG nova.network.neutron [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Successfully created port: 8d77159f-78cf-4687-ae54-133fc0667bda {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1727.988226] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1727.988601] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1728.057170] env[68571]: DEBUG nova.network.neutron [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Successfully updated port: 8d77159f-78cf-4687-ae54-133fc0667bda {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1728.067984] env[68571]: DEBUG oslo_concurrency.lockutils [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Acquiring lock "refresh_cache-5deee3f1-70a0-4c0d-bda6-365235ca0d78" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1728.068142] env[68571]: DEBUG oslo_concurrency.lockutils [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Acquired lock "refresh_cache-5deee3f1-70a0-4c0d-bda6-365235ca0d78" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1728.068381] env[68571]: DEBUG nova.network.neutron [None 
req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1728.104186] env[68571]: DEBUG nova.network.neutron [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1728.254054] env[68571]: DEBUG nova.network.neutron [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Updating instance_info_cache with network_info: [{"id": "8d77159f-78cf-4687-ae54-133fc0667bda", "address": "fa:16:3e:7c:0a:06", "network": {"id": "9eb5305a-a3b6-4b85-bd01-e69176fec6da", "bridge": "br-int", "label": "tempest-ServersTestJSON-89491618-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e7abc16f11794107bc45dfbe74a5a7ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "274afb4c-04df-4213-8ad2-8f48a10d78a8", "external-id": "nsx-vlan-transportzone-515", "segmentation_id": 515, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8d77159f-78", "ovs_interfaceid": "8d77159f-78cf-4687-ae54-133fc0667bda", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1728.266303] env[68571]: DEBUG oslo_concurrency.lockutils [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Releasing lock "refresh_cache-5deee3f1-70a0-4c0d-bda6-365235ca0d78" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1728.266573] env[68571]: DEBUG nova.compute.manager [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Instance network_info: |[{"id": "8d77159f-78cf-4687-ae54-133fc0667bda", "address": "fa:16:3e:7c:0a:06", "network": {"id": "9eb5305a-a3b6-4b85-bd01-e69176fec6da", "bridge": "br-int", "label": "tempest-ServersTestJSON-89491618-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e7abc16f11794107bc45dfbe74a5a7ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": 
{"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "274afb4c-04df-4213-8ad2-8f48a10d78a8", "external-id": "nsx-vlan-transportzone-515", "segmentation_id": 515, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8d77159f-78", "ovs_interfaceid": "8d77159f-78cf-4687-ae54-133fc0667bda", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1728.266945] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:7c:0a:06', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '274afb4c-04df-4213-8ad2-8f48a10d78a8', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '8d77159f-78cf-4687-ae54-133fc0667bda', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1728.274387] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Creating folder: Project (e7abc16f11794107bc45dfbe74a5a7ca). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1728.274846] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-666deeb7-fc2c-4072-b568-3d966594962b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1728.286504] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Created folder: Project (e7abc16f11794107bc45dfbe74a5a7ca) in parent group-v692787. [ 1728.286676] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Creating folder: Instances. Parent ref: group-v692886. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1728.286877] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-0b82019a-04d9-4ca7-ba32-1acfe63aae24 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1728.295214] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Created folder: Instances in parent group-v692886. [ 1728.295431] env[68571]: DEBUG oslo.service.loopingcall [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. 
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1728.295599] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1728.295779] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-b5a23597-f30b-4829-98dc-ea89ad4d21bf {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1728.315258] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1728.315258] env[68571]: value = "task-3467746" [ 1728.315258] env[68571]: _type = "Task" [ 1728.315258] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1728.321871] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467746, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1728.779751] env[68571]: DEBUG nova.compute.manager [req-ab05b300-43dc-438c-a5d7-b67584851786 req-5e22289d-84bb-4b81-ac0e-2230d087b410 service nova] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Received event network-vif-plugged-8d77159f-78cf-4687-ae54-133fc0667bda {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1728.779984] env[68571]: DEBUG oslo_concurrency.lockutils [req-ab05b300-43dc-438c-a5d7-b67584851786 req-5e22289d-84bb-4b81-ac0e-2230d087b410 service nova] Acquiring lock "5deee3f1-70a0-4c0d-bda6-365235ca0d78-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1728.780360] env[68571]: DEBUG oslo_concurrency.lockutils [req-ab05b300-43dc-438c-a5d7-b67584851786 req-5e22289d-84bb-4b81-ac0e-2230d087b410 service nova] Lock "5deee3f1-70a0-4c0d-bda6-365235ca0d78-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1728.780538] env[68571]: DEBUG oslo_concurrency.lockutils [req-ab05b300-43dc-438c-a5d7-b67584851786 req-5e22289d-84bb-4b81-ac0e-2230d087b410 service nova] Lock "5deee3f1-70a0-4c0d-bda6-365235ca0d78-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1728.780707] env[68571]: DEBUG nova.compute.manager [req-ab05b300-43dc-438c-a5d7-b67584851786 req-5e22289d-84bb-4b81-ac0e-2230d087b410 service nova] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] No waiting events found dispatching network-vif-plugged-8d77159f-78cf-4687-ae54-133fc0667bda {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1728.780870] env[68571]: WARNING nova.compute.manager [req-ab05b300-43dc-438c-a5d7-b67584851786 req-5e22289d-84bb-4b81-ac0e-2230d087b410 service nova] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Received unexpected event network-vif-plugged-8d77159f-78cf-4687-ae54-133fc0667bda for instance with vm_state building and task_state spawning. 
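The "No waiting events found ... Received unexpected event" pair above is the external-event handshake: Neutron reports network-vif-plugged, and the compute node looks for a registered waiter for that (instance, event) key under the per-instance "-events" lock; here nothing was waiting yet, so the event is logged and dropped. A minimal sketch of that registry pattern (a hypothetical class for illustration, not Nova's InstanceEvents implementation):

import threading

class EventWaiters:
    """Hypothetical registry; sketches the handshake, not Nova's class."""
    def __init__(self):
        self._lock = threading.Lock()  # plays the role of the "-events" lock
        self._waiters = {}  # (instance_uuid, event_name) -> threading.Event

    def prepare(self, instance_uuid, event_name):
        # Called before the operation that triggers the event (e.g. VIF plug).
        ev = threading.Event()
        with self._lock:
            self._waiters[(instance_uuid, event_name)] = ev
        return ev

    def pop(self, instance_uuid, event_name):
        # Called when the external callback arrives; True if a waiter was woken.
        with self._lock:
            ev = self._waiters.pop((instance_uuid, event_name), None)
        if ev is None:
            return False  # -> the "No waiting events found dispatching ..." path
        ev.set()
        return True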
[ 1728.781040] env[68571]: DEBUG nova.compute.manager [req-ab05b300-43dc-438c-a5d7-b67584851786 req-5e22289d-84bb-4b81-ac0e-2230d087b410 service nova] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Received event network-changed-8d77159f-78cf-4687-ae54-133fc0667bda {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1728.781200] env[68571]: DEBUG nova.compute.manager [req-ab05b300-43dc-438c-a5d7-b67584851786 req-5e22289d-84bb-4b81-ac0e-2230d087b410 service nova] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Refreshing instance network info cache due to event network-changed-8d77159f-78cf-4687-ae54-133fc0667bda. {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1728.781383] env[68571]: DEBUG oslo_concurrency.lockutils [req-ab05b300-43dc-438c-a5d7-b67584851786 req-5e22289d-84bb-4b81-ac0e-2230d087b410 service nova] Acquiring lock "refresh_cache-5deee3f1-70a0-4c0d-bda6-365235ca0d78" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1728.781524] env[68571]: DEBUG oslo_concurrency.lockutils [req-ab05b300-43dc-438c-a5d7-b67584851786 req-5e22289d-84bb-4b81-ac0e-2230d087b410 service nova] Acquired lock "refresh_cache-5deee3f1-70a0-4c0d-bda6-365235ca0d78" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1728.781664] env[68571]: DEBUG nova.network.neutron [req-ab05b300-43dc-438c-a5d7-b67584851786 req-5e22289d-84bb-4b81-ac0e-2230d087b410 service nova] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Refreshing network info cache for port 8d77159f-78cf-4687-ae54-133fc0667bda {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1728.827884] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467746, 'name': CreateVM_Task, 'duration_secs': 0.289734} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1728.828059] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1728.828681] env[68571]: DEBUG oslo_concurrency.lockutils [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1728.828840] env[68571]: DEBUG oslo_concurrency.lockutils [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1728.829433] env[68571]: DEBUG oslo_concurrency.lockutils [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1728.829433] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-49f8d116-006a-4352-8687-63f6e0a222fa {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1728.834725] env[68571]: DEBUG oslo_vmware.api [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Waiting for the task: (returnval){ [ 1728.834725] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52cb0f25-27fe-17b8-d37c-d584fb186ae5" [ 1728.834725] env[68571]: _type = "Task" [ 1728.834725] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1728.842745] env[68571]: DEBUG oslo_vmware.api [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52cb0f25-27fe-17b8-d37c-d584fb186ae5, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1729.206952] env[68571]: DEBUG nova.network.neutron [req-ab05b300-43dc-438c-a5d7-b67584851786 req-5e22289d-84bb-4b81-ac0e-2230d087b410 service nova] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Updated VIF entry in instance network info cache for port 8d77159f-78cf-4687-ae54-133fc0667bda. 
{{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1729.207374] env[68571]: DEBUG nova.network.neutron [req-ab05b300-43dc-438c-a5d7-b67584851786 req-5e22289d-84bb-4b81-ac0e-2230d087b410 service nova] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Updating instance_info_cache with network_info: [{"id": "8d77159f-78cf-4687-ae54-133fc0667bda", "address": "fa:16:3e:7c:0a:06", "network": {"id": "9eb5305a-a3b6-4b85-bd01-e69176fec6da", "bridge": "br-int", "label": "tempest-ServersTestJSON-89491618-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "e7abc16f11794107bc45dfbe74a5a7ca", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "274afb4c-04df-4213-8ad2-8f48a10d78a8", "external-id": "nsx-vlan-transportzone-515", "segmentation_id": 515, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap8d77159f-78", "ovs_interfaceid": "8d77159f-78cf-4687-ae54-133fc0667bda", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1729.216766] env[68571]: DEBUG oslo_concurrency.lockutils [req-ab05b300-43dc-438c-a5d7-b67584851786 req-5e22289d-84bb-4b81-ac0e-2230d087b410 service nova] Releasing lock "refresh_cache-5deee3f1-70a0-4c0d-bda6-365235ca0d78" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1729.345093] env[68571]: DEBUG oslo_concurrency.lockutils [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1729.345349] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1729.345555] env[68571]: DEBUG oslo_concurrency.lockutils [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1729.489034] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1731.490293] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None 
None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1731.490584] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1731.490584] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1731.513807] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1731.513984] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1731.514114] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1731.514245] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1731.514372] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1731.514494] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1731.514616] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1731.514735] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1731.514855] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 7fd03349-420c-4076-959c-31562e95098d] Skipping network cache update for instance because it is Building. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1731.514972] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1731.515106] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1733.489677] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1733.490084] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Cleaning up deleted instances with incomplete migration {{(pid=68571) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 1734.500911] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1734.501285] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1735.490836] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1740.490209] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1742.492677] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1742.515492] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1742.515694] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Cleaning up deleted instances {{(pid=68571) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 1742.523971] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] There are 0 instances to clean {{(pid=68571) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 1745.832644] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9d05d2f1-80d2-4deb-afbc-5817b578127b 
tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Acquiring lock "8506e00f-2b77-4fa1-804a-8e548b78ee7d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1748.962937] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "62ce83ad-bb1b-4f78-8d0b-9b516290bac6" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1748.963510] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "62ce83ad-bb1b-4f78-8d0b-9b516290bac6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1768.898974] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b6674472-2035-459b-9b56-fb9151b3d6b5 tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Acquiring lock "7fd03349-420c-4076-959c-31562e95098d" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1773.531716] env[68571]: WARNING oslo_vmware.rw_handles [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1773.531716] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1773.531716] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1773.531716] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1773.531716] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1773.531716] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 1773.531716] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1773.531716] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1773.531716] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1773.531716] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1773.531716] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1773.531716] env[68571]: ERROR oslo_vmware.rw_handles [ 1773.532523] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Downloaded image file 
data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/2731cd94-874b-4f6b-ac51-49af2558f9a7/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1773.534188] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1773.534428] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Copying Virtual Disk [datastore1] vmware_temp/2731cd94-874b-4f6b-ac51-49af2558f9a7/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/2731cd94-874b-4f6b-ac51-49af2558f9a7/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1773.534724] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-df3e4cba-412d-4863-80fd-92f2ae45013c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1773.542555] env[68571]: DEBUG oslo_vmware.api [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Waiting for the task: (returnval){ [ 1773.542555] env[68571]: value = "task-3467747" [ 1773.542555] env[68571]: _type = "Task" [ 1773.542555] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1773.549924] env[68571]: DEBUG oslo_vmware.api [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Task: {'id': task-3467747, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1774.053105] env[68571]: DEBUG oslo_vmware.exceptions [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Fault InvalidArgument not matched. 
{{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1774.053362] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1774.053903] env[68571]: ERROR nova.compute.manager [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1774.053903] env[68571]: Faults: ['InvalidArgument'] [ 1774.053903] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Traceback (most recent call last): [ 1774.053903] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1774.053903] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] yield resources [ 1774.053903] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1774.053903] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] self.driver.spawn(context, instance, image_meta, [ 1774.053903] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1774.053903] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1774.053903] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1774.053903] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] self._fetch_image_if_missing(context, vi) [ 1774.053903] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1774.054497] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] image_cache(vi, tmp_image_ds_loc) [ 1774.054497] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1774.054497] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] vm_util.copy_virtual_disk( [ 1774.054497] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1774.054497] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] session._wait_for_task(vmdk_copy_task) [ 1774.054497] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1774.054497] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] return self.wait_for_task(task_ref) [ 1774.054497] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1774.054497] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] return evt.wait() [ 1774.054497] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1774.054497] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] result = hub.switch() [ 1774.054497] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1774.054497] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] return self.greenlet.switch() [ 1774.054874] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1774.054874] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] self.f(*self.args, **self.kw) [ 1774.054874] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1774.054874] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] raise exceptions.translate_fault(task_info.error) [ 1774.054874] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1774.054874] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Faults: ['InvalidArgument'] [ 1774.054874] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] [ 1774.054874] env[68571]: INFO nova.compute.manager [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Terminating instance [ 1774.055742] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1774.055960] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1774.056201] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with 
opID=oslo.vmware-15cc528e-a3aa-41bb-ab6c-af309546fd78 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1774.058317] env[68571]: DEBUG nova.compute.manager [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1774.058508] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1774.059220] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e3a4d3f-0206-418a-a362-e0c6148b4476 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1774.066018] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1774.066258] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-70f3ca9f-8161-40d2-8bc9-65319c6b32f9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1774.068261] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1774.068431] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1774.069380] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-fadc6489-57e7-4ed6-91e4-aed77a5cc7a0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1774.073851] env[68571]: DEBUG oslo_vmware.api [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Waiting for the task: (returnval){ [ 1774.073851] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52b86ce7-732f-b996-bb7d-570037056ad1" [ 1774.073851] env[68571]: _type = "Task" [ 1774.073851] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1774.087197] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1774.087415] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Creating directory with path [datastore1] vmware_temp/2d0b53ad-ed3f-4ef9-98c9-d6b4d776a420/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1774.087622] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f8dc9f4c-2f29-49d8-8541-66e2c5ffadde {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1774.106072] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Created directory with path [datastore1] vmware_temp/2d0b53ad-ed3f-4ef9-98c9-d6b4d776a420/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1774.106262] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Fetch image to [datastore1] vmware_temp/2d0b53ad-ed3f-4ef9-98c9-d6b4d776a420/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1774.106434] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/2d0b53ad-ed3f-4ef9-98c9-d6b4d776a420/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1774.107151] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-82b1eb82-6b2b-43d6-9709-ce3c5b26df71 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1774.114819] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1c5c1881-e010-4e85-9f83-799bb498c4cd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1774.123728] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bac20d33-62fb-4004-a3da-9fa59201c515 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1774.154613] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cf7917ff-70bc-4717-a4ef-ce7c5b1d1c72 
{{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1774.157250] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1774.157445] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1774.157647] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Deleting the datastore file [datastore1] 56c7e368-4032-4028-83f0-58b0cd3b3cbd {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1774.157883] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-aaa6b23a-ad23-427c-81cb-f9fce8dfb628 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1774.164235] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-d056024e-8417-4017-a169-a5db7662c7ad {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1774.165919] env[68571]: DEBUG oslo_vmware.api [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Waiting for the task: (returnval){ [ 1774.165919] env[68571]: value = "task-3467749" [ 1774.165919] env[68571]: _type = "Task" [ 1774.165919] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1774.173835] env[68571]: DEBUG oslo_vmware.api [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Task: {'id': task-3467749, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1774.189725] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1774.239602] env[68571]: DEBUG oslo_vmware.rw_handles [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2d0b53ad-ed3f-4ef9-98c9-d6b4d776a420/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1774.297895] env[68571]: DEBUG oslo_vmware.rw_handles [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1774.298105] env[68571]: DEBUG oslo_vmware.rw_handles [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/2d0b53ad-ed3f-4ef9-98c9-d6b4d776a420/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1774.675714] env[68571]: DEBUG oslo_vmware.api [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Task: {'id': task-3467749, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.067596} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1774.676055] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1774.676158] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1774.676330] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1774.676501] env[68571]: INFO nova.compute.manager [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Took 0.62 seconds to destroy the instance on the hypervisor. [ 1774.678592] env[68571]: DEBUG nova.compute.claims [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1774.678765] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1774.678976] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1774.850278] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-24b7b168-3a22-4860-990c-8a5ccd2c56af {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1774.857234] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cbf31e21-bbb1-42b4-a086-d5eb0dc99adc {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1774.886796] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e44b02ad-477f-4ae5-bf3e-f8fb30777791 {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1774.893385] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41825bf8-20e4-4298-98e9-3d0af9b57fa5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1774.905662] env[68571]: DEBUG nova.compute.provider_tree [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1774.913690] env[68571]: DEBUG nova.scheduler.client.report [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1774.927573] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.248s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1774.928067] env[68571]: ERROR nova.compute.manager [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1774.928067] env[68571]: Faults: ['InvalidArgument'] [ 1774.928067] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Traceback (most recent call last): [ 1774.928067] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1774.928067] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] self.driver.spawn(context, instance, image_meta, [ 1774.928067] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1774.928067] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1774.928067] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1774.928067] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] 
self._fetch_image_if_missing(context, vi) [ 1774.928067] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1774.928067] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] image_cache(vi, tmp_image_ds_loc) [ 1774.928067] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1774.928419] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] vm_util.copy_virtual_disk( [ 1774.928419] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1774.928419] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] session._wait_for_task(vmdk_copy_task) [ 1774.928419] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1774.928419] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] return self.wait_for_task(task_ref) [ 1774.928419] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1774.928419] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] return evt.wait() [ 1774.928419] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1774.928419] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] result = hub.switch() [ 1774.928419] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1774.928419] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] return self.greenlet.switch() [ 1774.928419] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1774.928419] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] self.f(*self.args, **self.kw) [ 1774.928761] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1774.928761] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] raise exceptions.translate_fault(task_info.error) [ 1774.928761] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1774.928761] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Faults: ['InvalidArgument'] [ 1774.928761] env[68571]: ERROR nova.compute.manager [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] [ 1774.928761] env[68571]: DEBUG nova.compute.utils [None 
req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1774.930027] env[68571]: DEBUG nova.compute.manager [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Build of instance 56c7e368-4032-4028-83f0-58b0cd3b3cbd was re-scheduled: A specified parameter was not correct: fileType [ 1774.930027] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1774.930394] env[68571]: DEBUG nova.compute.manager [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1774.930611] env[68571]: DEBUG nova.compute.manager [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1774.930766] env[68571]: DEBUG nova.compute.manager [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1774.930928] env[68571]: DEBUG nova.network.neutron [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1775.328241] env[68571]: DEBUG nova.network.neutron [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1775.342134] env[68571]: INFO nova.compute.manager [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Took 0.41 seconds to deallocate network for instance.
[ 1775.438495] env[68571]: INFO nova.scheduler.client.report [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Deleted allocations for instance 56c7e368-4032-4028-83f0-58b0cd3b3cbd [ 1775.460309] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e522f9b5-5e9d-4f08-97c6-706d345e3f31 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Lock "56c7e368-4032-4028-83f0-58b0cd3b3cbd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 623.911s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1775.461981] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f6bf90a6-4734-4105-94a5-913444bff1d6 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Lock "56c7e368-4032-4028-83f0-58b0cd3b3cbd" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 427.592s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1775.462327] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f6bf90a6-4734-4105-94a5-913444bff1d6 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Acquiring lock "56c7e368-4032-4028-83f0-58b0cd3b3cbd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1775.462539] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f6bf90a6-4734-4105-94a5-913444bff1d6 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Lock "56c7e368-4032-4028-83f0-58b0cd3b3cbd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1775.462748] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f6bf90a6-4734-4105-94a5-913444bff1d6 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Lock "56c7e368-4032-4028-83f0-58b0cd3b3cbd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1775.464675] env[68571]: INFO nova.compute.manager [None req-f6bf90a6-4734-4105-94a5-913444bff1d6 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Terminating instance [ 1775.466813] env[68571]: DEBUG nova.compute.manager [None req-f6bf90a6-4734-4105-94a5-913444bff1d6 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Start destroying the instance on the hypervisor. 
{{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1775.467070] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-f6bf90a6-4734-4105-94a5-913444bff1d6 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1775.467373] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-d29ce711-b9da-41e3-81aa-c4927e94252f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1775.471524] env[68571]: DEBUG nova.compute.manager [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1775.482531] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-012869bd-2ba9-406a-b2e1-c1cf5b1c8cb1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1775.513585] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-f6bf90a6-4734-4105-94a5-913444bff1d6 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 56c7e368-4032-4028-83f0-58b0cd3b3cbd could not be found. [ 1775.513805] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-f6bf90a6-4734-4105-94a5-913444bff1d6 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1775.513986] env[68571]: INFO nova.compute.manager [None req-f6bf90a6-4734-4105-94a5-913444bff1d6 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Took 0.05 seconds to destroy the instance on the hypervisor. [ 1775.514255] env[68571]: DEBUG oslo.service.loopingcall [None req-f6bf90a6-4734-4105-94a5-913444bff1d6 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1775.515277] env[68571]: DEBUG nova.compute.manager [-] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1775.515344] env[68571]: DEBUG nova.network.neutron [-] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1775.535649] env[68571]: DEBUG oslo_concurrency.lockutils [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1775.535887] env[68571]: DEBUG oslo_concurrency.lockutils [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1775.539153] env[68571]: INFO nova.compute.claims [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1775.547491] env[68571]: DEBUG nova.network.neutron [-] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1775.558702] env[68571]: INFO nova.compute.manager [-] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] Took 0.04 seconds to deallocate network for instance. [ 1775.657839] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f6bf90a6-4734-4105-94a5-913444bff1d6 tempest-FloatingIPsAssociationTestJSON-1247097416 tempest-FloatingIPsAssociationTestJSON-1247097416-project-member] Lock "56c7e368-4032-4028-83f0-58b0cd3b3cbd" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.196s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1775.659408] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "56c7e368-4032-4028-83f0-58b0cd3b3cbd" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 54.164s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1775.659649] env[68571]: INFO nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 56c7e368-4032-4028-83f0-58b0cd3b3cbd] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1775.659830] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "56c7e368-4032-4028-83f0-58b0cd3b3cbd" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1775.721482] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0694727f-d072-450a-8437-174c1a5c6859 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1775.729200] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-22225ad3-6ace-48ae-befb-3a77d0e1d9d8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1775.758469] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e817d583-f1c4-4ce5-b973-e55a5d9913d9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1775.765318] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b8b8f554-19c2-4ea8-9e1b-e33f208ac455 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1775.778052] env[68571]: DEBUG nova.compute.provider_tree [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1775.785981] env[68571]: DEBUG nova.scheduler.client.report [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1775.799621] env[68571]: DEBUG oslo_concurrency.lockutils [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.264s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1775.800097] env[68571]: DEBUG nova.compute.manager [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Start building networks asynchronously for instance. 
{{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1775.830656] env[68571]: DEBUG nova.compute.utils [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1775.833928] env[68571]: DEBUG nova.compute.manager [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1775.834224] env[68571]: DEBUG nova.network.neutron [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1775.842570] env[68571]: DEBUG nova.compute.manager [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1775.899822] env[68571]: DEBUG nova.policy [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b21fda9650f1447a81a5994f05fc8078', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '157830f5757b429383d95b2b4c0a384c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 1775.902815] env[68571]: DEBUG nova.compute.manager [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1775.927420] env[68571]: DEBUG nova.virt.hardware [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1775.927692] env[68571]: DEBUG nova.virt.hardware [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1775.927856] env[68571]: DEBUG nova.virt.hardware [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1775.928050] env[68571]: DEBUG nova.virt.hardware [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1775.928200] env[68571]: DEBUG nova.virt.hardware [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1775.928346] env[68571]: DEBUG nova.virt.hardware [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1775.928551] env[68571]: DEBUG nova.virt.hardware [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1775.928710] env[68571]: DEBUG nova.virt.hardware [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1775.928872] env[68571]: DEBUG nova.virt.hardware [None 
req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1775.929050] env[68571]: DEBUG nova.virt.hardware [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1775.929228] env[68571]: DEBUG nova.virt.hardware [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1775.930170] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d53b973d-631d-440a-8d4d-6d2db34fcd5b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1775.938123] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4533057c-6ac5-4d14-8de1-66112caae13e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1776.175982] env[68571]: DEBUG nova.network.neutron [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Successfully created port: 74925514-2cba-4724-a166-71275f959513 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1776.724359] env[68571]: DEBUG nova.network.neutron [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Successfully updated port: 74925514-2cba-4724-a166-71275f959513 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1776.740346] env[68571]: DEBUG oslo_concurrency.lockutils [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquiring lock "refresh_cache-f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1776.740533] env[68571]: DEBUG oslo_concurrency.lockutils [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquired lock "refresh_cache-f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1776.740683] env[68571]: DEBUG nova.network.neutron [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1776.755283] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9e739162-2179-4d1d-a385-510daa6bb14e tempest-ServersTestJSON-1811012872 
tempest-ServersTestJSON-1811012872-project-member] Acquiring lock "5deee3f1-70a0-4c0d-bda6-365235ca0d78" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1776.785975] env[68571]: DEBUG nova.network.neutron [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1776.943930] env[68571]: DEBUG nova.network.neutron [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Updating instance_info_cache with network_info: [{"id": "74925514-2cba-4724-a166-71275f959513", "address": "fa:16:3e:1b:39:f7", "network": {"id": "375189f9-b770-49f9-a6e3-f686fe031694", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-452330210-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "157830f5757b429383d95b2b4c0a384c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c979f78-8597-41f8-b1de-995014032689", "external-id": "nsx-vlan-transportzone-477", "segmentation_id": 477, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap74925514-2c", "ovs_interfaceid": "74925514-2cba-4724-a166-71275f959513", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1776.955580] env[68571]: DEBUG oslo_concurrency.lockutils [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Releasing lock "refresh_cache-f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1776.955877] env[68571]: DEBUG nova.compute.manager [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Instance network_info: |[{"id": "74925514-2cba-4724-a166-71275f959513", "address": "fa:16:3e:1b:39:f7", "network": {"id": "375189f9-b770-49f9-a6e3-f686fe031694", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-452330210-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "157830f5757b429383d95b2b4c0a384c", "mtu": 8950, "physical_network":
"default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c979f78-8597-41f8-b1de-995014032689", "external-id": "nsx-vlan-transportzone-477", "segmentation_id": 477, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap74925514-2c", "ovs_interfaceid": "74925514-2cba-4724-a166-71275f959513", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1776.956298] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:1b:39:f7', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8c979f78-8597-41f8-b1de-995014032689', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': '74925514-2cba-4724-a166-71275f959513', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1776.964079] env[68571]: DEBUG oslo.service.loopingcall [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1776.964513] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1776.964740] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-45f78371-98b1-4ded-aaea-72dc5d40a695 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1776.985424] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1776.985424] env[68571]: value = "task-3467750" [ 1776.985424] env[68571]: _type = "Task" [ 1776.985424] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1776.993060] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467750, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1777.366521] env[68571]: DEBUG nova.compute.manager [req-03f3e0f8-8b4f-4971-9cdc-6cec195feb12 req-10793a1b-242a-4015-838a-a5e2313f2de9 service nova] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Received event network-vif-plugged-74925514-2cba-4724-a166-71275f959513 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1777.366814] env[68571]: DEBUG oslo_concurrency.lockutils [req-03f3e0f8-8b4f-4971-9cdc-6cec195feb12 req-10793a1b-242a-4015-838a-a5e2313f2de9 service nova] Acquiring lock "f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1777.367238] env[68571]: DEBUG oslo_concurrency.lockutils [req-03f3e0f8-8b4f-4971-9cdc-6cec195feb12 req-10793a1b-242a-4015-838a-a5e2313f2de9 service nova] Lock "f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1777.367449] env[68571]: DEBUG oslo_concurrency.lockutils [req-03f3e0f8-8b4f-4971-9cdc-6cec195feb12 req-10793a1b-242a-4015-838a-a5e2313f2de9 service nova] Lock "f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1777.367803] env[68571]: DEBUG nova.compute.manager [req-03f3e0f8-8b4f-4971-9cdc-6cec195feb12 req-10793a1b-242a-4015-838a-a5e2313f2de9 service nova] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] No waiting events found dispatching network-vif-plugged-74925514-2cba-4724-a166-71275f959513 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1777.367941] env[68571]: WARNING nova.compute.manager [req-03f3e0f8-8b4f-4971-9cdc-6cec195feb12 req-10793a1b-242a-4015-838a-a5e2313f2de9 service nova] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Received unexpected event network-vif-plugged-74925514-2cba-4724-a166-71275f959513 for instance with vm_state building and task_state spawning. [ 1777.368154] env[68571]: DEBUG nova.compute.manager [req-03f3e0f8-8b4f-4971-9cdc-6cec195feb12 req-10793a1b-242a-4015-838a-a5e2313f2de9 service nova] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Received event network-changed-74925514-2cba-4724-a166-71275f959513 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1777.368393] env[68571]: DEBUG nova.compute.manager [req-03f3e0f8-8b4f-4971-9cdc-6cec195feb12 req-10793a1b-242a-4015-838a-a5e2313f2de9 service nova] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Refreshing instance network info cache due to event network-changed-74925514-2cba-4724-a166-71275f959513.
{{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1777.368660] env[68571]: DEBUG oslo_concurrency.lockutils [req-03f3e0f8-8b4f-4971-9cdc-6cec195feb12 req-10793a1b-242a-4015-838a-a5e2313f2de9 service nova] Acquiring lock "refresh_cache-f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1777.368863] env[68571]: DEBUG oslo_concurrency.lockutils [req-03f3e0f8-8b4f-4971-9cdc-6cec195feb12 req-10793a1b-242a-4015-838a-a5e2313f2de9 service nova] Acquired lock "refresh_cache-f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1777.369143] env[68571]: DEBUG nova.network.neutron [req-03f3e0f8-8b4f-4971-9cdc-6cec195feb12 req-10793a1b-242a-4015-838a-a5e2313f2de9 service nova] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Refreshing network info cache for port 74925514-2cba-4724-a166-71275f959513 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1777.494532] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467750, 'name': CreateVM_Task, 'duration_secs': 0.274738} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1777.496487] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1777.497136] env[68571]: DEBUG oslo_concurrency.lockutils [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1777.497303] env[68571]: DEBUG oslo_concurrency.lockutils [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1777.497641] env[68571]: DEBUG oslo_concurrency.lockutils [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1777.498142] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-8399578b-8fdc-4346-a06f-33acd0459191 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1777.501994] env[68571]: DEBUG oslo_vmware.api [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Waiting for the task: (returnval){ [ 1777.501994] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52b0de0c-ff45-dff5-6020-2a4db0de88c6" [ 1777.501994] env[68571]: _type = "Task" [ 1777.501994] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1777.511199] env[68571]: DEBUG oslo_vmware.api [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52b0de0c-ff45-dff5-6020-2a4db0de88c6, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1777.604901] env[68571]: DEBUG nova.network.neutron [req-03f3e0f8-8b4f-4971-9cdc-6cec195feb12 req-10793a1b-242a-4015-838a-a5e2313f2de9 service nova] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Updated VIF entry in instance network info cache for port 74925514-2cba-4724-a166-71275f959513. {{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1777.605261] env[68571]: DEBUG nova.network.neutron [req-03f3e0f8-8b4f-4971-9cdc-6cec195feb12 req-10793a1b-242a-4015-838a-a5e2313f2de9 service nova] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Updating instance_info_cache with network_info: [{"id": "74925514-2cba-4724-a166-71275f959513", "address": "fa:16:3e:1b:39:f7", "network": {"id": "375189f9-b770-49f9-a6e3-f686fe031694", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-452330210-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "157830f5757b429383d95b2b4c0a384c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c979f78-8597-41f8-b1de-995014032689", "external-id": "nsx-vlan-transportzone-477", "segmentation_id": 477, "bound_drivers": {"0": "nsxv3"}}, "devname": "tap74925514-2c", "ovs_interfaceid": "74925514-2cba-4724-a166-71275f959513", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1777.614767] env[68571]: DEBUG oslo_concurrency.lockutils [req-03f3e0f8-8b4f-4971-9cdc-6cec195feb12 req-10793a1b-242a-4015-838a-a5e2313f2de9 service nova] Releasing lock "refresh_cache-f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1778.013408] env[68571]: DEBUG oslo_concurrency.lockutils [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1778.013408] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1778.013741] env[68571]: DEBUG oslo_concurrency.lockutils [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1783.498744] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1785.489702] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1785.490132] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1785.502154] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1785.502357] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1785.502527] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1785.502684] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1785.503777] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-af7b0180-8ae1-436a-9781-4c280d98bd73 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1785.511756] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9dfac851-3b27-4c2b-ad76-9a2d0337da6b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1785.525610] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8d9609ea-25f7-4440-932f-4b6a21e7ecf5 {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1785.531631] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-deb5580f-9c9c-4b8b-8242-ee54aecbf271 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1785.559335] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180890MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1785.559455] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1785.559670] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1785.631622] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 47df3a07-1271-482c-bd3a-92fb9cef17bd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1785.631805] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 73ba7761-3724-46ed-95c5-e93a6627a2d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1785.631934] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance d890a035-a14e-4be0-97c8-87edd9bb88e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1785.632069] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 9e8c8d14-144f-42e3-8556-796651b7b04f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1785.632191] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 1f8dd053-ebd8-4ad9-a607-ab364a3320ca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1785.632308] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1785.632422] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 8506e00f-2b77-4fa1-804a-8e548b78ee7d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1785.632535] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 7fd03349-420c-4076-959c-31562e95098d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1785.632685] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 5deee3f1-70a0-4c0d-bda6-365235ca0d78 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1785.632805] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1785.643360] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 62ce83ad-bb1b-4f78-8d0b-9b516290bac6 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1764}} [ 1785.643568] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1785.643714] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1785.763922] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9f1683c5-e0c0-4c62-b90a-a03160942e48 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1785.771334] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba5b2218-7c07-4f88-8070-ef50b5a2f786 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1785.800105] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eb2c63e5-909f-4f81-b1c1-5bd538c369e7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1785.806959] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5293a79-0873-4204-8807-2b47e43aaa3c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1785.819414] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1785.827700] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1785.842234] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1785.842412] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.283s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1787.842599] env[68571]: DEBUG oslo_service.periodic_task [None 
req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1789.484136] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1791.489794] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1791.489794] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1791.490172] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1791.509861] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1791.510119] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1791.510314] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1791.510448] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1791.510601] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1791.510704] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1791.510841] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Skipping network cache update for instance because it is Building. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1791.510958] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 7fd03349-420c-4076-959c-31562e95098d] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1791.511087] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1791.511205] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1791.511323] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1791.511808] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1795.489862] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1795.490234] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1795.490306] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1801.320089] env[68571]: DEBUG oslo_concurrency.lockutils [None req-19d95806-f096-439f-82ed-e383733ddc97 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquiring lock "f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1822.400460] env[68571]: WARNING oslo_vmware.rw_handles [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1822.400460] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1822.400460] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1822.400460] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1822.400460] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1822.400460] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 1822.400460] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1822.400460] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1822.400460] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1822.400460] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1822.400460] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1822.400460] env[68571]: ERROR oslo_vmware.rw_handles [ 1822.401274] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/2d0b53ad-ed3f-4ef9-98c9-d6b4d776a420/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1822.402931] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1822.403187] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Copying Virtual Disk [datastore1] vmware_temp/2d0b53ad-ed3f-4ef9-98c9-d6b4d776a420/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/2d0b53ad-ed3f-4ef9-98c9-d6b4d776a420/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1822.403492] env[68571]: DEBUG
oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-912d64fe-b9ac-46f6-a2a4-5e9017dbae17 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1822.411594] env[68571]: DEBUG oslo_vmware.api [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Waiting for the task: (returnval){ [ 1822.411594] env[68571]: value = "task-3467751" [ 1822.411594] env[68571]: _type = "Task" [ 1822.411594] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1822.419765] env[68571]: DEBUG oslo_vmware.api [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Task: {'id': task-3467751, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1822.922054] env[68571]: DEBUG oslo_vmware.exceptions [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Fault InvalidArgument not matched. {{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1822.922343] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1822.922902] env[68571]: ERROR nova.compute.manager [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1822.922902] env[68571]: Faults: ['InvalidArgument'] [ 1822.922902] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Traceback (most recent call last): [ 1822.922902] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1822.922902] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] yield resources [ 1822.922902] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1822.922902] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] self.driver.spawn(context, instance, image_meta, [ 1822.922902] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1822.922902] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1822.922902] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1822.922902] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] self._fetch_image_if_missing(context, vi) [ 1822.922902] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1822.923707] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] image_cache(vi, tmp_image_ds_loc) [ 1822.923707] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1822.923707] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] vm_util.copy_virtual_disk( [ 1822.923707] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1822.923707] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] session._wait_for_task(vmdk_copy_task) [ 1822.923707] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1822.923707] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] return self.wait_for_task(task_ref) [ 1822.923707] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1822.923707] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] return evt.wait() [ 1822.923707] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1822.923707] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] result = hub.switch() [ 1822.923707] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1822.923707] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] return self.greenlet.switch() [ 1822.924764] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1822.924764] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] self.f(*self.args, **self.kw) [ 1822.924764] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1822.924764] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] raise exceptions.translate_fault(task_info.error) [ 1822.924764] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1822.924764] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Faults: ['InvalidArgument'] [ 1822.924764] 
env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] [ 1822.924764] env[68571]: INFO nova.compute.manager [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Terminating instance [ 1822.924764] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1822.925460] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1822.925460] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-65705e8a-dc89-458c-bb06-254f0b07a112 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1822.927382] env[68571]: DEBUG nova.compute.manager [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1822.927564] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1822.928321] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fb2d1c1-434f-44c6-8c20-2bfb0ad2b6fa {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1822.935402] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1822.935700] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5436d186-a52d-44f0-be69-5fcbaba206de {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1822.938865] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1822.939052] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 
tempest-DeleteServersTestJSON-1837775584-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1822.939712] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-e0995eeb-a4c9-4eae-ba42-3a7596b26b17 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1822.944428] env[68571]: DEBUG oslo_vmware.api [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Waiting for the task: (returnval){ [ 1822.944428] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52639586-ae44-0dfb-f97e-480ed0a80c4f" [ 1822.944428] env[68571]: _type = "Task" [ 1822.944428] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1822.951586] env[68571]: DEBUG oslo_vmware.api [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52639586-ae44-0dfb-f97e-480ed0a80c4f, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1823.004593] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1823.004815] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1823.004994] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Deleting the datastore file [datastore1] 47df3a07-1271-482c-bd3a-92fb9cef17bd {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1823.005282] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-502dd638-624a-44cb-a7e1-02385825863d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1823.011977] env[68571]: DEBUG oslo_vmware.api [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Waiting for the task: (returnval){ [ 1823.011977] env[68571]: value = "task-3467753" [ 1823.011977] env[68571]: _type = "Task" [ 1823.011977] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1823.019297] env[68571]: DEBUG oslo_vmware.api [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Task: {'id': task-3467753, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1823.455288] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1823.455658] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Creating directory with path [datastore1] vmware_temp/a3351f7b-df00-4e4b-8775-ff64be108aa0/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1823.455763] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0cb39f61-39e1-4260-a938-4c71817ff9ad {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1823.467126] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Created directory with path [datastore1] vmware_temp/a3351f7b-df00-4e4b-8775-ff64be108aa0/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1823.467322] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Fetch image to [datastore1] vmware_temp/a3351f7b-df00-4e4b-8775-ff64be108aa0/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1823.467494] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/a3351f7b-df00-4e4b-8775-ff64be108aa0/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1823.468213] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6c5bd69c-9b87-4748-8217-99157a1b6f65 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1823.474459] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6ac1335a-e415-4468-a159-3c820909a903 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1823.483336] env[68571]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-52167085-6dd2-4912-bcb0-551ce3c6839e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1823.517235] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5861f29d-a704-4e82-ad6e-f8c6480d3d0b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1823.523977] env[68571]: DEBUG oslo_vmware.api [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Task: {'id': task-3467753, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.078915} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1823.525367] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1823.525560] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1823.525729] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1823.525898] env[68571]: INFO nova.compute.manager [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Took 0.60 seconds to destroy the instance on the hypervisor. 
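(Editor's note: the traceback above ends in oslo_vmware.exceptions.VimFaultException, raised from _poll_task once the CopyVirtualDisk_Task entered an error state; the surrounding records show the same wait-for-task pattern polling "progress is 0%" until a task completes with a duration_secs. Below is a minimal sketch of that poll-and-translate loop. It is an illustration only, not oslo.vmware's implementation: the `wait_for_task` helper, the toy `VimFaultException` class, and the `fake_task_info` callable are all hypothetical stand-ins.)

```python
# Illustrative sketch of the wait_for_task/_poll_task pattern visible in the
# log: poll a task's info until it reaches a terminal state, log progress,
# and surface a vCenter fault such as InvalidArgument ("fileType") as an
# exception. NOT the oslo.vmware implementation; names are stand-ins.
import time
from types import SimpleNamespace


class VimFaultException(Exception):
    """Toy stand-in for oslo_vmware.exceptions.VimFaultException."""

    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list


def wait_for_task(get_task_info, interval=0.5):
    """Poll ``get_task_info`` (a callable returning an object with ``state``,
    ``progress`` and ``error`` attributes, mirroring vSphere's TaskInfo)
    until the task succeeds or errors."""
    while True:
        info = get_task_info()
        if info.state == 'success':
            return info
        if info.state == 'error':
            # oslo.vmware's _poll_task does the equivalent via
            # exceptions.translate_fault(task_info.error).
            raise VimFaultException(info.error.fault_list, info.error.msg)
        print(f"Task progress is {info.progress or 0}%.")
        time.sleep(interval)


# Hypothetical usage that reproduces the failure mode seen above: the task
# transitions from 'running' to 'error' carrying an InvalidArgument fault.
_states = iter(['running', 'error'])
_fault = SimpleNamespace(fault_list=['InvalidArgument'],
                         msg='A specified parameter was not correct: fileType')


def fake_task_info():
    return SimpleNamespace(state=next(_states), progress=0, error=_fault)


if __name__ == '__main__':
    try:
        wait_for_task(fake_task_info, interval=0)
    except VimFaultException as exc:
        print(f"{exc} Faults: {exc.fault_list}")
```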
[ 1823.527626] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-868f89aa-9051-413a-a807-0ecd4d332a36 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1823.529533] env[68571]: DEBUG nova.compute.claims [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1823.529702] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1823.529929] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1823.553050] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1823.609392] env[68571]: DEBUG oslo_vmware.rw_handles [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a3351f7b-df00-4e4b-8775-ff64be108aa0/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1823.670089] env[68571]: DEBUG oslo_vmware.rw_handles [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1823.670309] env[68571]: DEBUG oslo_vmware.rw_handles [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Closing write handle for https://esx7c2n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/a3351f7b-df00-4e4b-8775-ff64be108aa0/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1823.762623] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-48d1a166-3d33-4f5a-b31c-89935b413259 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1823.770144] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-42c62ff5-1cb1-4fc0-b15a-cff588f4f1d4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1823.798250] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b28e4150-c8e0-4d8d-a147-51b2e96f4958 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1823.804924] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aeeeb0eb-8e08-449d-abfa-6707da4fefbf {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1823.817261] env[68571]: DEBUG nova.compute.provider_tree [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1823.825786] env[68571]: DEBUG nova.scheduler.client.report [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1823.842760] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.313s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1823.843296] env[68571]: ERROR nova.compute.manager [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1823.843296] env[68571]: Faults: ['InvalidArgument'] [ 1823.843296] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Traceback (most recent call last): [ 1823.843296] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1823.843296] env[68571]: ERROR 
nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] self.driver.spawn(context, instance, image_meta, [ 1823.843296] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1823.843296] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1823.843296] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1823.843296] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] self._fetch_image_if_missing(context, vi) [ 1823.843296] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1823.843296] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] image_cache(vi, tmp_image_ds_loc) [ 1823.843296] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1823.843717] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] vm_util.copy_virtual_disk( [ 1823.843717] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1823.843717] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] session._wait_for_task(vmdk_copy_task) [ 1823.843717] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1823.843717] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] return self.wait_for_task(task_ref) [ 1823.843717] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1823.843717] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] return evt.wait() [ 1823.843717] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1823.843717] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] result = hub.switch() [ 1823.843717] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1823.843717] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] return self.greenlet.switch() [ 1823.843717] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1823.843717] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] self.f(*self.args, **self.kw) [ 1823.844148] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1823.844148] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] raise exceptions.translate_fault(task_info.error) [ 1823.844148] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1823.844148] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Faults: ['InvalidArgument'] [ 1823.844148] env[68571]: ERROR nova.compute.manager [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] [ 1823.844148] env[68571]: DEBUG nova.compute.utils [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1823.845321] env[68571]: DEBUG nova.compute.manager [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Build of instance 47df3a07-1271-482c-bd3a-92fb9cef17bd was re-scheduled: A specified parameter was not correct: fileType [ 1823.845321] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1823.845686] env[68571]: DEBUG nova.compute.manager [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1823.845856] env[68571]: DEBUG nova.compute.manager [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1823.846040] env[68571]: DEBUG nova.compute.manager [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1823.846224] env[68571]: DEBUG nova.network.neutron [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1824.225105] env[68571]: DEBUG nova.network.neutron [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1824.234937] env[68571]: INFO nova.compute.manager [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Took 0.39 seconds to deallocate network for instance. [ 1824.321161] env[68571]: INFO nova.scheduler.client.report [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Deleted allocations for instance 47df3a07-1271-482c-bd3a-92fb9cef17bd [ 1824.342509] env[68571]: DEBUG oslo_concurrency.lockutils [None req-dd7ecb32-f4ac-41a6-a714-4923c4d7dfdd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Lock "47df3a07-1271-482c-bd3a-92fb9cef17bd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 625.026s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1824.343980] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ffa0f883-166e-400e-90d6-ab6437f668dd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Lock "47df3a07-1271-482c-bd3a-92fb9cef17bd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 428.839s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1824.344226] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ffa0f883-166e-400e-90d6-ab6437f668dd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Acquiring lock "47df3a07-1271-482c-bd3a-92fb9cef17bd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1824.344433] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ffa0f883-166e-400e-90d6-ab6437f668dd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Lock "47df3a07-1271-482c-bd3a-92fb9cef17bd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s 
{{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1824.344599] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ffa0f883-166e-400e-90d6-ab6437f668dd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Lock "47df3a07-1271-482c-bd3a-92fb9cef17bd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1824.346431] env[68571]: INFO nova.compute.manager [None req-ffa0f883-166e-400e-90d6-ab6437f668dd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Terminating instance [ 1824.348152] env[68571]: DEBUG nova.compute.manager [None req-ffa0f883-166e-400e-90d6-ab6437f668dd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1824.348367] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ffa0f883-166e-400e-90d6-ab6437f668dd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1824.348862] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a614cf4d-945d-4897-9690-8af1aca39032 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1824.358429] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9d92c6f8-0b32-496c-a77b-b5b984d061f4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1824.389868] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-ffa0f883-166e-400e-90d6-ab6437f668dd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 47df3a07-1271-482c-bd3a-92fb9cef17bd could not be found. [ 1824.390096] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ffa0f883-166e-400e-90d6-ab6437f668dd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1824.390306] env[68571]: INFO nova.compute.manager [None req-ffa0f883-166e-400e-90d6-ab6437f668dd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1824.390576] env[68571]: DEBUG oslo.service.loopingcall [None req-ffa0f883-166e-400e-90d6-ab6437f668dd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1824.390917] env[68571]: DEBUG nova.compute.manager [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1824.394009] env[68571]: DEBUG nova.compute.manager [-] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1824.394181] env[68571]: DEBUG nova.network.neutron [-] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1824.421087] env[68571]: DEBUG nova.network.neutron [-] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1824.435332] env[68571]: INFO nova.compute.manager [-] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] Took 0.04 seconds to deallocate network for instance. [ 1824.456173] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1824.456457] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1824.457877] env[68571]: INFO nova.compute.claims [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1824.522946] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ffa0f883-166e-400e-90d6-ab6437f668dd tempest-ServerActionsTestJSON-1302242736 tempest-ServerActionsTestJSON-1302242736-project-member] Lock "47df3a07-1271-482c-bd3a-92fb9cef17bd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.179s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1824.523762] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "47df3a07-1271-482c-bd3a-92fb9cef17bd" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 103.028s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1824.523965] env[68571]: INFO nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 47df3a07-1271-482c-bd3a-92fb9cef17bd] During sync_power_state the instance has a 
pending task (deleting). Skip. [ 1824.524170] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "47df3a07-1271-482c-bd3a-92fb9cef17bd" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1824.620659] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b112bdf6-93bf-4dff-8a2d-ef5af9cdb6a9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1824.628172] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9280671b-c976-415a-824d-def92944a042 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1824.658964] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4839b098-6a30-4587-8a64-a6747515ba81 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1824.665675] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bba81158-a6be-4834-ac7b-3f53d3c2d005 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1824.678594] env[68571]: DEBUG nova.compute.provider_tree [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1824.687842] env[68571]: DEBUG nova.scheduler.client.report [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1824.700757] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.244s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1824.701254] env[68571]: DEBUG nova.compute.manager [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Start building networks asynchronously for instance. 
{{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1824.734051] env[68571]: DEBUG nova.compute.utils [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1824.735601] env[68571]: DEBUG nova.compute.manager [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1824.735773] env[68571]: DEBUG nova.network.neutron [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1824.744325] env[68571]: DEBUG nova.compute.manager [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1824.800291] env[68571]: DEBUG nova.policy [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd34e5361b36c4dc5824b0f42a37e6bb8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '290427ab03f446ce9297ea393c083ff9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 1824.809413] env[68571]: DEBUG nova.compute.manager [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1824.834925] env[68571]: DEBUG nova.virt.hardware [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=<?>,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=<?>,size=21318656,status='active',tags=<?>,updated_at=2025-04-16T21:24:40Z,virtual_size=<?>,visibility=<?>), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1824.835169] env[68571]: DEBUG nova.virt.hardware [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1824.835327] env[68571]: DEBUG nova.virt.hardware [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1824.835506] env[68571]: DEBUG nova.virt.hardware [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1824.835648] env[68571]: DEBUG nova.virt.hardware [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1824.835790] env[68571]: DEBUG nova.virt.hardware [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1824.835993] env[68571]: DEBUG nova.virt.hardware [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1824.836167] env[68571]: DEBUG nova.virt.hardware [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1824.836338] env[68571]: DEBUG 
nova.virt.hardware [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1824.836500] env[68571]: DEBUG nova.virt.hardware [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1824.836671] env[68571]: DEBUG nova.virt.hardware [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1824.837529] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fd137cbc-7b81-43a8-b3c6-661e3d3b0fb1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1824.845330] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d02427c-e8e0-403e-af1c-bd7a276d3d9a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1825.085906] env[68571]: DEBUG nova.network.neutron [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Successfully created port: a0da8755-47be-4650-aa22-6b837fb291ad {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1825.648573] env[68571]: DEBUG nova.network.neutron [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Successfully updated port: a0da8755-47be-4650-aa22-6b837fb291ad {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1825.660927] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "refresh_cache-62ce83ad-bb1b-4f78-8d0b-9b516290bac6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1825.661217] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquired lock "refresh_cache-62ce83ad-bb1b-4f78-8d0b-9b516290bac6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1825.661441] env[68571]: DEBUG nova.network.neutron [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1825.697235] env[68571]: DEBUG nova.network.neutron [None 
req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1825.847659] env[68571]: DEBUG nova.network.neutron [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Updating instance_info_cache with network_info: [{"id": "a0da8755-47be-4650-aa22-6b837fb291ad", "address": "fa:16:3e:5d:2e:23", "network": {"id": "653e8d49-b7ab-4d09-aa68-b76012e5b38e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-503364041-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "290427ab03f446ce9297ea393c083ff9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa0da8755-47", "ovs_interfaceid": "a0da8755-47be-4650-aa22-6b837fb291ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1825.858396] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Releasing lock "refresh_cache-62ce83ad-bb1b-4f78-8d0b-9b516290bac6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1825.860197] env[68571]: DEBUG nova.compute.manager [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Instance network_info: |[{"id": "a0da8755-47be-4650-aa22-6b837fb291ad", "address": "fa:16:3e:5d:2e:23", "network": {"id": "653e8d49-b7ab-4d09-aa68-b76012e5b38e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-503364041-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "290427ab03f446ce9297ea393c083ff9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa0da8755-47", "ovs_interfaceid": 
"a0da8755-47be-4650-aa22-6b837fb291ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1971}} [ 1825.860335] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:5d:2e:23', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2180b40f-2bb0-47da-ba80-c2fbe7f98af0', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a0da8755-47be-4650-aa22-6b837fb291ad', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1825.867229] env[68571]: DEBUG oslo.service.loopingcall [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1825.867229] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1825.867318] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-fd3f6ff2-54a7-4212-86e2-b3ade860cb1b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1825.888333] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1825.888333] env[68571]: value = "task-3467754" [ 1825.888333] env[68571]: _type = "Task" [ 1825.888333] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1825.895736] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467754, 'name': CreateVM_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1826.259049] env[68571]: DEBUG nova.compute.manager [req-be21e915-9b26-4af1-b6ba-56055e31f70d req-32ea7f42-1397-460d-a266-8a3b1fbebd75 service nova] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Received event network-vif-plugged-a0da8755-47be-4650-aa22-6b837fb291ad {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1826.259267] env[68571]: DEBUG oslo_concurrency.lockutils [req-be21e915-9b26-4af1-b6ba-56055e31f70d req-32ea7f42-1397-460d-a266-8a3b1fbebd75 service nova] Acquiring lock "62ce83ad-bb1b-4f78-8d0b-9b516290bac6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1826.259471] env[68571]: DEBUG oslo_concurrency.lockutils [req-be21e915-9b26-4af1-b6ba-56055e31f70d req-32ea7f42-1397-460d-a266-8a3b1fbebd75 service nova] Lock "62ce83ad-bb1b-4f78-8d0b-9b516290bac6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1826.259640] env[68571]: DEBUG oslo_concurrency.lockutils [req-be21e915-9b26-4af1-b6ba-56055e31f70d req-32ea7f42-1397-460d-a266-8a3b1fbebd75 service nova] Lock "62ce83ad-bb1b-4f78-8d0b-9b516290bac6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1826.259803] env[68571]: DEBUG nova.compute.manager [req-be21e915-9b26-4af1-b6ba-56055e31f70d req-32ea7f42-1397-460d-a266-8a3b1fbebd75 service nova] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] No waiting events found dispatching network-vif-plugged-a0da8755-47be-4650-aa22-6b837fb291ad {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1826.259975] env[68571]: WARNING nova.compute.manager [req-be21e915-9b26-4af1-b6ba-56055e31f70d req-32ea7f42-1397-460d-a266-8a3b1fbebd75 service nova] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Received unexpected event network-vif-plugged-a0da8755-47be-4650-aa22-6b837fb291ad for instance with vm_state building and task_state spawning. [ 1826.260190] env[68571]: DEBUG nova.compute.manager [req-be21e915-9b26-4af1-b6ba-56055e31f70d req-32ea7f42-1397-460d-a266-8a3b1fbebd75 service nova] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Received event network-changed-a0da8755-47be-4650-aa22-6b837fb291ad {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1826.260372] env[68571]: DEBUG nova.compute.manager [req-be21e915-9b26-4af1-b6ba-56055e31f70d req-32ea7f42-1397-460d-a266-8a3b1fbebd75 service nova] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Refreshing instance network info cache due to event network-changed-a0da8755-47be-4650-aa22-6b837fb291ad. 
{{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 1826.260558] env[68571]: DEBUG oslo_concurrency.lockutils [req-be21e915-9b26-4af1-b6ba-56055e31f70d req-32ea7f42-1397-460d-a266-8a3b1fbebd75 service nova] Acquiring lock "refresh_cache-62ce83ad-bb1b-4f78-8d0b-9b516290bac6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1826.260691] env[68571]: DEBUG oslo_concurrency.lockutils [req-be21e915-9b26-4af1-b6ba-56055e31f70d req-32ea7f42-1397-460d-a266-8a3b1fbebd75 service nova] Acquired lock "refresh_cache-62ce83ad-bb1b-4f78-8d0b-9b516290bac6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1826.260841] env[68571]: DEBUG nova.network.neutron [req-be21e915-9b26-4af1-b6ba-56055e31f70d req-32ea7f42-1397-460d-a266-8a3b1fbebd75 service nova] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Refreshing network info cache for port a0da8755-47be-4650-aa22-6b837fb291ad {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 1826.400539] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467754, 'name': CreateVM_Task, 'duration_secs': 0.27639} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1826.400740] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1826.401550] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1826.401707] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1826.402091] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1826.402765] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d3b0ef5f-aa5d-4d81-82d8-c8e73c63ad5d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1826.407599] env[68571]: DEBUG oslo_vmware.api [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Waiting for the task: (returnval){ [ 1826.407599] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]523d4f0d-022b-7ff1-0d95-9e2fe39b419e" [ 1826.407599] env[68571]: _type = "Task" [ 1826.407599] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1826.415988] env[68571]: DEBUG oslo_vmware.api [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]523d4f0d-022b-7ff1-0d95-9e2fe39b419e, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1826.679832] env[68571]: DEBUG nova.network.neutron [req-be21e915-9b26-4af1-b6ba-56055e31f70d req-32ea7f42-1397-460d-a266-8a3b1fbebd75 service nova] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Updated VIF entry in instance network info cache for port a0da8755-47be-4650-aa22-6b837fb291ad. {{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 1826.680274] env[68571]: DEBUG nova.network.neutron [req-be21e915-9b26-4af1-b6ba-56055e31f70d req-32ea7f42-1397-460d-a266-8a3b1fbebd75 service nova] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Updating instance_info_cache with network_info: [{"id": "a0da8755-47be-4650-aa22-6b837fb291ad", "address": "fa:16:3e:5d:2e:23", "network": {"id": "653e8d49-b7ab-4d09-aa68-b76012e5b38e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-503364041-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "290427ab03f446ce9297ea393c083ff9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa0da8755-47", "ovs_interfaceid": "a0da8755-47be-4650-aa22-6b837fb291ad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1826.689471] env[68571]: DEBUG oslo_concurrency.lockutils [req-be21e915-9b26-4af1-b6ba-56055e31f70d req-32ea7f42-1397-460d-a266-8a3b1fbebd75 service nova] Releasing lock "refresh_cache-62ce83ad-bb1b-4f78-8d0b-9b516290bac6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1826.918969] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1826.919196] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing 
/opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1826.919425] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1844.489174] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1845.489495] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1845.501692] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1845.501908] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1845.502090] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1845.502250] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1845.503370] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49f11ffe-e2b2-4705-afb7-418eec196191 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.512042] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-17d89eb1-7f30-4d42-99b3-5cf3828942e9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.526877] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e64b0eee-5f4d-436c-a5c3-9d001c2cc822 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.533098] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-23693772-0d4a-4669-8125-17e312beb08d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
1845.561500] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180931MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1845.561642] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1845.561830] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1845.634207] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 73ba7761-3724-46ed-95c5-e93a6627a2d3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1845.634373] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance d890a035-a14e-4be0-97c8-87edd9bb88e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1845.634493] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 9e8c8d14-144f-42e3-8556-796651b7b04f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1845.634618] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 1f8dd053-ebd8-4ad9-a607-ab364a3320ca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1845.634737] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1845.634854] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 8506e00f-2b77-4fa1-804a-8e548b78ee7d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1845.634970] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 7fd03349-420c-4076-959c-31562e95098d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1845.635104] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 5deee3f1-70a0-4c0d-bda6-365235ca0d78 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1845.635219] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1845.635333] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 62ce83ad-bb1b-4f78-8d0b-9b516290bac6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1845.635515] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 10 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1845.635647] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1792MB phys_disk=200GB used_disk=10GB total_vcpus=48 used_vcpus=10 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1845.749365] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-44b2b456-89c8-4ed7-bc67-55ed7f9aced0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.756921] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7b50d9c4-a15e-4190-b4f6-0f3b126171e3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.786908] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c3398f2-16c7-4d25-a78d-bab7617c7b93 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.793995] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6fa97426-cd00-4390-8edc-dc1e68db2e0a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1845.806799] env[68571]: DEBUG 
nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1845.814709] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1845.827398] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1845.827586] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.266s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1847.828380] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1847.828703] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1849.484460] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1852.490448] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1853.490505] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1853.490815] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1853.490815] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 
1853.510420] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1853.510591] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1853.510726] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1853.510855] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1853.510978] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1853.511115] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1853.511235] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 7fd03349-420c-4076-959c-31562e95098d] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1853.511352] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1853.511740] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1853.511740] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1853.511740] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1855.489750] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1856.489518] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1856.489821] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1866.486820] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1869.464204] env[68571]: WARNING oslo_vmware.rw_handles [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1869.464204] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1869.464204] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1869.464204] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1869.464204] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1869.464204] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 1869.464204] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1869.464204] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1869.464204] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1869.464204] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1869.464204] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1869.464204] env[68571]: ERROR oslo_vmware.rw_handles [ 1869.464932] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/a3351f7b-df00-4e4b-8775-ff64be108aa0/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1869.466645] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Caching image 
{{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1869.466886] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Copying Virtual Disk [datastore1] vmware_temp/a3351f7b-df00-4e4b-8775-ff64be108aa0/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/a3351f7b-df00-4e4b-8775-ff64be108aa0/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1869.467191] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-60b5728e-a855-4831-885b-9b7a4b30bf27 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1869.475196] env[68571]: DEBUG oslo_vmware.api [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Waiting for the task: (returnval){ [ 1869.475196] env[68571]: value = "task-3467755" [ 1869.475196] env[68571]: _type = "Task" [ 1869.475196] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1869.482632] env[68571]: DEBUG oslo_vmware.api [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Task: {'id': task-3467755, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1869.986022] env[68571]: DEBUG oslo_vmware.exceptions [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Fault InvalidArgument not matched. 
{{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1869.986327] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1869.986888] env[68571]: ERROR nova.compute.manager [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1869.986888] env[68571]: Faults: ['InvalidArgument'] [ 1869.986888] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Traceback (most recent call last): [ 1869.986888] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1869.986888] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] yield resources [ 1869.986888] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1869.986888] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] self.driver.spawn(context, instance, image_meta, [ 1869.986888] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1869.986888] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1869.986888] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1869.986888] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] self._fetch_image_if_missing(context, vi) [ 1869.986888] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1869.987383] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] image_cache(vi, tmp_image_ds_loc) [ 1869.987383] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1869.987383] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] vm_util.copy_virtual_disk( [ 1869.987383] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1869.987383] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] session._wait_for_task(vmdk_copy_task) [ 1869.987383] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1869.987383] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] return self.wait_for_task(task_ref) [ 1869.987383] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1869.987383] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] return evt.wait() [ 1869.987383] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1869.987383] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] result = hub.switch() [ 1869.987383] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1869.987383] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] return self.greenlet.switch() [ 1869.987838] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1869.987838] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] self.f(*self.args, **self.kw) [ 1869.987838] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1869.987838] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] raise exceptions.translate_fault(task_info.error) [ 1869.987838] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1869.987838] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Faults: ['InvalidArgument'] [ 1869.987838] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] [ 1869.987838] env[68571]: INFO nova.compute.manager [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Terminating instance [ 1869.989288] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1869.989500] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1869.989744] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-2c940943-064b-4f19-a0e7-a8eb44448053 
{{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1869.991895] env[68571]: DEBUG nova.compute.manager [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1869.992097] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1869.992881] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1edc79e2-c4b6-49e9-9770-b005ef4a7d0a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1869.999441] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1869.999639] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-4506cec7-f3a4-48ef-8ba9-3008f9efdc56 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.001718] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1870.001884] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1870.002821] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-b6a91d58-d31a-43d9-b09d-8efd2fe68bcb {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.007189] env[68571]: DEBUG oslo_vmware.api [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Waiting for the task: (returnval){ [ 1870.007189] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52d85677-7531-8c21-c1ad-5f79935f9d0c" [ 1870.007189] env[68571]: _type = "Task" [ 1870.007189] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1870.014236] env[68571]: DEBUG oslo_vmware.api [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52d85677-7531-8c21-c1ad-5f79935f9d0c, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1870.073934] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1870.074156] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1870.074336] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Deleting the datastore file [datastore1] 73ba7761-3724-46ed-95c5-e93a6627a2d3 {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1870.074600] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-2ad84dd5-b8c7-4620-866e-13e4e80c3dc3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.080312] env[68571]: DEBUG oslo_vmware.api [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Waiting for the task: (returnval){ [ 1870.080312] env[68571]: value = "task-3467757" [ 1870.080312] env[68571]: _type = "Task" [ 1870.080312] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1870.087435] env[68571]: DEBUG oslo_vmware.api [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Task: {'id': task-3467757, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1870.517449] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1870.517831] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Creating directory with path [datastore1] vmware_temp/3fa4325f-4d5a-4f24-be5a-1f25323cbaa8/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1870.517895] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-99ca2acb-3f8a-4361-8988-0fdd91bf2c08 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.531077] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Created directory with path [datastore1] vmware_temp/3fa4325f-4d5a-4f24-be5a-1f25323cbaa8/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1870.531264] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Fetch image to [datastore1] vmware_temp/3fa4325f-4d5a-4f24-be5a-1f25323cbaa8/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1870.531433] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/3fa4325f-4d5a-4f24-be5a-1f25323cbaa8/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1870.532161] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7d71ffa4-f6fc-48b5-8f85-a3e31d9e540d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.539176] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-21c5bfea-de73-4e66-8429-940d9cfe5821 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.548256] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8cab8c39-1a8b-47b9-8077-0d8a82145f05 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.578066] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-82847c38-0142-44f2-ac5b-2fba23047da4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.585813] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-83afaf6d-347d-485b-9fb0-65ccd44139e1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.589836] env[68571]: DEBUG oslo_vmware.api [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Task: {'id': task-3467757, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.063115} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1870.590346] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1870.590579] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1870.590756] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1870.590927] env[68571]: INFO nova.compute.manager [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Took 0.60 seconds to destroy the instance on the hypervisor. 
[ 1870.592908] env[68571]: DEBUG nova.compute.claims [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1870.593087] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1870.593313] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1870.610771] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1870.756186] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b3b6a43f-ba24-4553-bd78-8b4debad7239 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.763577] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8a86395f-430f-40b8-8ca5-a10e54606d47 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.793899] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1870.794675] env[68571]: ERROR nova.compute.manager [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Instance failed to spawn: nova.exception.ImageNotAuthorized: Not authorized for image 6e7bf233-3ffe-4b3b-a510-62353d0292a6. 
[ 1870.794675] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Traceback (most recent call last): [ 1870.794675] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1870.794675] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1870.794675] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1870.794675] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] result = getattr(controller, method)(*args, **kwargs) [ 1870.794675] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1870.794675] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return self._get(image_id) [ 1870.794675] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1870.794675] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1870.794675] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1870.795087] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] resp, body = self.http_client.get(url, headers=header) [ 1870.795087] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1870.795087] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return self.request(url, 'GET', **kwargs) [ 1870.795087] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1870.795087] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return self._handle_response(resp) [ 1870.795087] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1870.795087] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] raise exc.from_response(resp, resp.content) [ 1870.795087] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1870.795087] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] [ 1870.795087] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] During handling of the above exception, another exception occurred: [ 1870.795087] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] [ 1870.795087] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Traceback (most recent call last): [ 1870.795512] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1870.795512] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] yield resources [ 1870.795512] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1870.795512] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] self.driver.spawn(context, instance, image_meta, [ 1870.795512] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1870.795512] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1870.795512] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1870.795512] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] self._fetch_image_if_missing(context, vi) [ 1870.795512] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1870.795512] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] image_fetch(context, vi, tmp_image_ds_loc) [ 1870.795512] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1870.795512] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] images.fetch_image( [ 1870.795512] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1870.795978] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] metadata = IMAGE_API.get(context, image_ref) [ 1870.795978] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1870.795978] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return session.show(context, image_id, [ 1870.795978] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1870.795978] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] _reraise_translated_image_exception(image_id) [ 1870.795978] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File 
"/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1870.795978] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] raise new_exc.with_traceback(exc_trace) [ 1870.795978] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1870.795978] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1870.795978] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1870.795978] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] result = getattr(controller, method)(*args, **kwargs) [ 1870.795978] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1870.795978] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return self._get(image_id) [ 1870.796421] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1870.796421] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1870.796421] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1870.796421] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] resp, body = self.http_client.get(url, headers=header) [ 1870.796421] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1870.796421] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return self.request(url, 'GET', **kwargs) [ 1870.796421] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1870.796421] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return self._handle_response(resp) [ 1870.796421] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1870.796421] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] raise exc.from_response(resp, resp.content) [ 1870.796421] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] nova.exception.ImageNotAuthorized: Not authorized for image 6e7bf233-3ffe-4b3b-a510-62353d0292a6. 
[ 1870.796421] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] [ 1870.796874] env[68571]: INFO nova.compute.manager [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Terminating instance [ 1870.796874] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ae5ebc8c-a775-403d-9e50-8a4af393c70f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.799236] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1870.799435] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1870.800168] env[68571]: DEBUG nova.compute.manager [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1870.800270] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1870.800485] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8cf29195-423c-4862-9a12-699dd8cccab7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.802695] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7735360f-1093-468c-b307-69b7d6217fd4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.809060] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1870.810693] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-5ae1f6d2-f06d-4291-9b34-9362337d207f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.813679] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7abc92a1-549c-44cd-9d25-3a04c7f255ab {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.817397] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1870.817570] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1870.818496] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-74725694-7a0d-475e-adcb-c1e6747cd6fe {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.829959] env[68571]: DEBUG nova.compute.provider_tree [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1870.832122] env[68571]: DEBUG oslo_vmware.api [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Waiting for the task: (returnval){ [ 1870.832122] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]5224bc95-686e-8c2c-ede9-e3902dd2c28a" [ 1870.832122] env[68571]: _type = "Task" [ 1870.832122] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1870.840076] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1870.840317] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Creating directory with path [datastore1] vmware_temp/78f42d77-be26-4335-b064-1bd98b51ec78/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1870.841090] env[68571]: DEBUG nova.scheduler.client.report [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1870.843740] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c9b96107-5b03-4710-a8f2-0d339750cc05 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.856659] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.263s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1870.857216] env[68571]: ERROR nova.compute.manager [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1870.857216] env[68571]: Faults: ['InvalidArgument'] [ 1870.857216] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Traceback (most recent call last): [ 1870.857216] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1870.857216] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] self.driver.spawn(context, instance, image_meta, [ 1870.857216] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1870.857216] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] self._vmops.spawn(context, instance, 
image_meta, injected_files, [ 1870.857216] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1870.857216] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] self._fetch_image_if_missing(context, vi) [ 1870.857216] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1870.857216] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] image_cache(vi, tmp_image_ds_loc) [ 1870.857216] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1870.857674] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] vm_util.copy_virtual_disk( [ 1870.857674] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1870.857674] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] session._wait_for_task(vmdk_copy_task) [ 1870.857674] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1870.857674] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] return self.wait_for_task(task_ref) [ 1870.857674] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1870.857674] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] return evt.wait() [ 1870.857674] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1870.857674] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] result = hub.switch() [ 1870.857674] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1870.857674] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] return self.greenlet.switch() [ 1870.857674] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1870.857674] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] self.f(*self.args, **self.kw) [ 1870.858077] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1870.858077] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] raise exceptions.translate_fault(task_info.error) [ 1870.858077] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1870.858077] 
env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Faults: ['InvalidArgument'] [ 1870.858077] env[68571]: ERROR nova.compute.manager [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] [ 1870.858077] env[68571]: DEBUG nova.compute.utils [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1870.859535] env[68571]: DEBUG nova.compute.manager [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Build of instance 73ba7761-3724-46ed-95c5-e93a6627a2d3 was re-scheduled: A specified parameter was not correct: fileType [ 1870.859535] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1870.859943] env[68571]: DEBUG nova.compute.manager [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1870.860151] env[68571]: DEBUG nova.compute.manager [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1870.860351] env[68571]: DEBUG nova.compute.manager [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1870.860538] env[68571]: DEBUG nova.network.neutron [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1870.864836] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Created directory with path [datastore1] vmware_temp/78f42d77-be26-4335-b064-1bd98b51ec78/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1870.865044] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Fetch image to [datastore1] vmware_temp/78f42d77-be26-4335-b064-1bd98b51ec78/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1870.865221] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None 
req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/78f42d77-be26-4335-b064-1bd98b51ec78/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1870.866000] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c3317da5-667d-4dd3-8cc1-f42f5aa804c2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.873275] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8bc1363-e080-4a82-bb65-4547d7dff68d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.876923] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1870.877133] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1870.877310] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Deleting the datastore file [datastore1] d890a035-a14e-4be0-97c8-87edd9bb88e4 {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1870.877854] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-e124b847-f4ad-4400-8a1a-ec6c9004f83d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.884945] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3e6fe3eb-cf7f-4a7e-bb0c-fc7b123ee365 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.889761] env[68571]: DEBUG oslo_vmware.api [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Waiting for the task: (returnval){ [ 1870.889761] env[68571]: value = "task-3467759" [ 1870.889761] env[68571]: _type = "Task" [ 1870.889761] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1870.919408] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-204469bd-4331-4685-8dbe-5b1900ae48e3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.924688] env[68571]: DEBUG oslo_vmware.api [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Task: {'id': task-3467759, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1870.927749] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-a7e4006e-7493-4cee-a366-d1bcdb5029c8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1870.956583] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1871.009418] env[68571]: DEBUG oslo_vmware.rw_handles [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/78f42d77-be26-4335-b064-1bd98b51ec78/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1871.068165] env[68571]: DEBUG oslo_vmware.rw_handles [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1871.068390] env[68571]: DEBUG oslo_vmware.rw_handles [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/78f42d77-be26-4335-b064-1bd98b51ec78/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1871.176463] env[68571]: DEBUG nova.network.neutron [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1871.189929] env[68571]: INFO nova.compute.manager [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Took 0.33 seconds to deallocate network for instance. [ 1871.281337] env[68571]: INFO nova.scheduler.client.report [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Deleted allocations for instance 73ba7761-3724-46ed-95c5-e93a6627a2d3 [ 1871.304023] env[68571]: DEBUG oslo_concurrency.lockutils [None req-ab8e3b37-4ae8-4215-8970-47f186048ad3 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Lock "73ba7761-3724-46ed-95c5-e93a6627a2d3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 659.968s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1871.304023] env[68571]: DEBUG oslo_concurrency.lockutils [None req-aea423aa-004d-4278-9d6e-54494a094108 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Lock "73ba7761-3724-46ed-95c5-e93a6627a2d3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 462.991s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1871.304023] env[68571]: DEBUG oslo_concurrency.lockutils [None req-aea423aa-004d-4278-9d6e-54494a094108 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquiring lock "73ba7761-3724-46ed-95c5-e93a6627a2d3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1871.304274] env[68571]: DEBUG oslo_concurrency.lockutils [None req-aea423aa-004d-4278-9d6e-54494a094108 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Lock "73ba7761-3724-46ed-95c5-e93a6627a2d3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1871.304274] env[68571]: DEBUG oslo_concurrency.lockutils [None req-aea423aa-004d-4278-9d6e-54494a094108 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Lock "73ba7761-3724-46ed-95c5-e93a6627a2d3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1871.306389] env[68571]: INFO nova.compute.manager [None req-aea423aa-004d-4278-9d6e-54494a094108 
tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Terminating instance [ 1871.307762] env[68571]: DEBUG nova.compute.manager [None req-aea423aa-004d-4278-9d6e-54494a094108 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1871.307955] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-aea423aa-004d-4278-9d6e-54494a094108 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1871.308526] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-fd5e1637-8eef-4d36-8568-dd40ca938280 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1871.317577] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2eebdee0-855d-4e94-9c44-af0795c84cc7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1871.345514] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-aea423aa-004d-4278-9d6e-54494a094108 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 73ba7761-3724-46ed-95c5-e93a6627a2d3 could not be found. [ 1871.345727] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-aea423aa-004d-4278-9d6e-54494a094108 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1871.345908] env[68571]: INFO nova.compute.manager [None req-aea423aa-004d-4278-9d6e-54494a094108 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1871.346343] env[68571]: DEBUG oslo.service.loopingcall [None req-aea423aa-004d-4278-9d6e-54494a094108 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1871.346431] env[68571]: DEBUG nova.compute.manager [-] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1871.346489] env[68571]: DEBUG nova.network.neutron [-] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1871.371053] env[68571]: DEBUG nova.network.neutron [-] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1871.379166] env[68571]: INFO nova.compute.manager [-] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] Took 0.03 seconds to deallocate network for instance. [ 1871.399326] env[68571]: DEBUG oslo_vmware.api [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Task: {'id': task-3467759, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.064159} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1871.399589] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1871.399772] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1871.399944] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1871.400132] env[68571]: INFO nova.compute.manager [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Took 0.60 seconds to destroy the instance on the hypervisor. 
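The DeleteDatastoreFile_Task records above ('Waiting for the task', 'progress is 0%', 'completed successfully ... duration_secs: 0.064159') all come from oslo.vmware's task polling, the same machinery that raised the VimFaultException earlier when the virtual-disk copy failed. A condensed sketch of that wait-for-task loop follows, assuming a simplified TaskInfo stand-in; the real oslo_vmware.api runs the poll on a loopingcall under eventlet and translates vSphere faults into typed exceptions.

```python
import time


class VimFaultException(Exception):
    """Stand-in for oslo_vmware.exceptions.VimFaultException."""

    def __init__(self, fault_list, message):
        super().__init__(message)
        self.fault_list = fault_list


def wait_for_task(get_task_info, poll_interval=0.5):
    """Poll a vSphere-style task until it finishes.

    get_task_info is a caller-supplied callable returning an object with
    .state, .result and .error attributes -- a stand-in for the TaskInfo
    that the PropertyCollector calls in the log retrieve.
    """
    while True:
        info = get_task_info()
        if info.state == "success":
            return info.result
        if info.state == "error":
            # Mirrors _poll_task in the earlier traceback: a failed task
            # (e.g. Faults: ['InvalidArgument']) becomes a typed exception.
            raise VimFaultException(info.error.faults, info.error.msg)
        # The 'progress is 0%' record above was emitted from this phase.
        time.sleep(poll_interval)
```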
[ 1871.402127] env[68571]: DEBUG nova.compute.claims [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1871.402295] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1871.402457] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1871.486766] env[68571]: DEBUG oslo_concurrency.lockutils [None req-aea423aa-004d-4278-9d6e-54494a094108 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Lock "73ba7761-3724-46ed-95c5-e93a6627a2d3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.183s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1871.487684] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "73ba7761-3724-46ed-95c5-e93a6627a2d3" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 149.992s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1871.488033] env[68571]: INFO nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 73ba7761-3724-46ed-95c5-e93a6627a2d3] During sync_power_state the instance has a pending task (deleting). Skip. 
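The 'Acquiring lock ... by ...', 'acquired ... waited Ns', and '"released" ... held Ns' records here are oslo.concurrency's lock instrumentation; the qualified names (e.g. ComputeManager.terminate_instance.<locals>.do_terminate_instance) identify the function holding the lock. A rough sketch of the pattern, using a hypothetical timed_lock helper rather than the real lockutils API:

```python
import threading
import time
from contextlib import contextmanager

_LOCKS = {}  # name -> threading.Lock, analogous to lockutils' lock registry


@contextmanager
def timed_lock(name, caller):
    # Hypothetical helper; the real code lives in oslo_concurrency.lockutils.
    lock = _LOCKS.setdefault(name, threading.Lock())
    t0 = time.monotonic()
    lock.acquire()
    print('Lock "%s" acquired by "%s" :: waited %.3fs'
          % (name, caller, time.monotonic() - t0))
    t1 = time.monotonic()
    try:
        yield
    finally:
        lock.release()
        print('Lock "%s" "released" by "%s" :: held %.3fs'
              % (name, caller, time.monotonic() - t1))


# Usage shaped like the records above: claim changes serialize on a single
# per-host "compute_resources" lock.
with timed_lock("compute_resources",
                "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim"):
    pass  # abort the claim / return resources to the tracker here
```

Because compute_resources is one lock per host, the 0.253s the claim abort holds it (released a few records below) is dead time for every other claim on this host; the per-instance lock above shows the same effect at larger scale, with terminate_instance waiting 462.991s behind a build that held the lock for 659.968s.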
[ 1871.488236] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "73ba7761-3724-46ed-95c5-e93a6627a2d3" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1871.576824] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9763a93a-05cc-457a-842b-051661075a7e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1871.584819] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-782724ac-dab3-4740-aea9-e265490ae949 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1871.614451] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-01efddfb-eeb2-4b9a-ba7f-6f542a5e2398 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1871.621284] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c8699f6-2ead-47ad-a6d4-3cd9ff90ca49 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1871.633761] env[68571]: DEBUG nova.compute.provider_tree [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1871.642616] env[68571]: DEBUG nova.scheduler.client.report [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1871.655241] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.253s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1871.655931] env[68571]: ERROR nova.compute.manager [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Failed to build and run instance: nova.exception.ImageNotAuthorized: Not authorized for image 6e7bf233-3ffe-4b3b-a510-62353d0292a6. 
[ 1871.655931] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Traceback (most recent call last): [ 1871.655931] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1871.655931] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1871.655931] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1871.655931] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] result = getattr(controller, method)(*args, **kwargs) [ 1871.655931] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1871.655931] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return self._get(image_id) [ 1871.655931] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1871.655931] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1871.655931] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1871.656332] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] resp, body = self.http_client.get(url, headers=header) [ 1871.656332] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1871.656332] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return self.request(url, 'GET', **kwargs) [ 1871.656332] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1871.656332] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return self._handle_response(resp) [ 1871.656332] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1871.656332] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] raise exc.from_response(resp, resp.content) [ 1871.656332] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1871.656332] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] [ 1871.656332] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] During handling of the above exception, another exception occurred: [ 1871.656332] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] [ 1871.656332] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Traceback (most recent call last): [ 1871.656725] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1871.656725] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] self.driver.spawn(context, instance, image_meta, [ 1871.656725] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1871.656725] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1871.656725] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1871.656725] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] self._fetch_image_if_missing(context, vi) [ 1871.656725] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1871.656725] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] image_fetch(context, vi, tmp_image_ds_loc) [ 1871.656725] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1871.656725] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] images.fetch_image( [ 1871.656725] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1871.656725] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] metadata = IMAGE_API.get(context, image_ref) [ 1871.656725] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1871.657163] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return session.show(context, image_id, [ 1871.657163] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1871.657163] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] _reraise_translated_image_exception(image_id) [ 1871.657163] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1871.657163] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] raise new_exc.with_traceback(exc_trace) [ 1871.657163] env[68571]: ERROR nova.compute.manager [instance: 
d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1871.657163] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1871.657163] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1871.657163] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] result = getattr(controller, method)(*args, **kwargs) [ 1871.657163] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1871.657163] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return self._get(image_id) [ 1871.657163] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1871.657163] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1871.657683] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1871.657683] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] resp, body = self.http_client.get(url, headers=header) [ 1871.657683] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1871.657683] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return self.request(url, 'GET', **kwargs) [ 1871.657683] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1871.657683] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return self._handle_response(resp) [ 1871.657683] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1871.657683] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] raise exc.from_response(resp, resp.content) [ 1871.657683] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] nova.exception.ImageNotAuthorized: Not authorized for image 6e7bf233-3ffe-4b3b-a510-62353d0292a6. [ 1871.657683] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] [ 1871.658029] env[68571]: DEBUG nova.compute.utils [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Not authorized for image 6e7bf233-3ffe-4b3b-a510-62353d0292a6. 
{{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1871.658029] env[68571]: DEBUG nova.compute.manager [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Build of instance d890a035-a14e-4be0-97c8-87edd9bb88e4 was re-scheduled: Not authorized for image 6e7bf233-3ffe-4b3b-a510-62353d0292a6. {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1871.658332] env[68571]: DEBUG nova.compute.manager [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1871.658509] env[68571]: DEBUG nova.compute.manager [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1871.658670] env[68571]: DEBUG nova.compute.manager [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1871.658828] env[68571]: DEBUG nova.network.neutron [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1871.762230] env[68571]: DEBUG neutronclient.v2_0.client [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68571) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1871.763472] env[68571]: ERROR nova.compute.manager [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Failed to deallocate networks: nova.exception.Unauthorized: Not authorized. 
[ 1871.763472] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Traceback (most recent call last): [ 1871.763472] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1871.763472] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1871.763472] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1871.763472] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] result = getattr(controller, method)(*args, **kwargs) [ 1871.763472] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1871.763472] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return self._get(image_id) [ 1871.763472] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1871.763472] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1871.763472] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1871.763801] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] resp, body = self.http_client.get(url, headers=header) [ 1871.763801] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1871.763801] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return self.request(url, 'GET', **kwargs) [ 1871.763801] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1871.763801] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return self._handle_response(resp) [ 1871.763801] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1871.763801] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] raise exc.from_response(resp, resp.content) [ 1871.763801] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] glanceclient.exc.HTTPUnauthorized: HTTP 401 Unauthorized: This server could not verify that you are authorized to access the document you requested. Either you supplied the wrong credentials (e.g., bad password), or your browser does not understand how to supply the credentials required. 
[ 1871.763801] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] [ 1871.763801] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] During handling of the above exception, another exception occurred: [ 1871.763801] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] [ 1871.763801] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Traceback (most recent call last): [ 1871.764151] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1871.764151] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] self.driver.spawn(context, instance, image_meta, [ 1871.764151] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1871.764151] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1871.764151] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1871.764151] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] self._fetch_image_if_missing(context, vi) [ 1871.764151] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 637, in _fetch_image_if_missing [ 1871.764151] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] image_fetch(context, vi, tmp_image_ds_loc) [ 1871.764151] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 420, in _fetch_image_as_file [ 1871.764151] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] images.fetch_image( [ 1871.764151] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/virt/vmwareapi/images.py", line 251, in fetch_image [ 1871.764151] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] metadata = IMAGE_API.get(context, image_ref) [ 1871.764151] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/image/glance.py", line 1205, in get [ 1871.764566] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return session.show(context, image_id, [ 1871.764566] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/image/glance.py", line 287, in show [ 1871.764566] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] _reraise_translated_image_exception(image_id) [ 1871.764566] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/image/glance.py", line 1031, in _reraise_translated_image_exception [ 1871.764566] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] raise new_exc.with_traceback(exc_trace) [ 1871.764566] env[68571]: ERROR nova.compute.manager [instance: 
d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/image/glance.py", line 285, in show [ 1871.764566] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] image = self._client.call(context, 2, 'get', args=(image_id,)) [ 1871.764566] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/image/glance.py", line 191, in call [ 1871.764566] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] result = getattr(controller, method)(*args, **kwargs) [ 1871.764566] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 197, in get [ 1871.764566] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return self._get(image_id) [ 1871.764566] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/utils.py", line 649, in inner [ 1871.764566] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return RequestIdProxy(wrapped(*args, **kwargs)) [ 1871.764981] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/v2/images.py", line 190, in _get [ 1871.764981] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] resp, body = self.http_client.get(url, headers=header) [ 1871.764981] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/keystoneauth1/adapter.py", line 395, in get [ 1871.764981] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return self.request(url, 'GET', **kwargs) [ 1871.764981] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 380, in request [ 1871.764981] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return self._handle_response(resp) [ 1871.764981] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/glanceclient/common/http.py", line 120, in _handle_response [ 1871.764981] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] raise exc.from_response(resp, resp.content) [ 1871.764981] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] nova.exception.ImageNotAuthorized: Not authorized for image 6e7bf233-3ffe-4b3b-a510-62353d0292a6. 
[ 1871.764981] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] [ 1871.764981] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] During handling of the above exception, another exception occurred: [ 1871.764981] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] [ 1871.764981] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Traceback (most recent call last): [ 1871.765401] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/compute/manager.py", line 2430, in _do_build_and_run_instance [ 1871.765401] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] self._build_and_run_instance(context, instance, image, [ 1871.765401] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/compute/manager.py", line 2722, in _build_and_run_instance [ 1871.765401] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] raise exception.RescheduledException( [ 1871.765401] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] nova.exception.RescheduledException: Build of instance d890a035-a14e-4be0-97c8-87edd9bb88e4 was re-scheduled: Not authorized for image 6e7bf233-3ffe-4b3b-a510-62353d0292a6. [ 1871.765401] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] [ 1871.765401] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] During handling of the above exception, another exception occurred: [ 1871.765401] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] [ 1871.765401] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Traceback (most recent call last): [ 1871.765401] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1871.765401] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] ret = obj(*args, **kwargs) [ 1871.765401] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1871.765401] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] exception_handler_v20(status_code, error_body) [ 1871.765870] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1871.765870] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] raise client_exc(message=error_message, [ 1871.765870] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1871.765870] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Neutron server returns request_ids: ['req-7b069cb0-bda6-4a7f-a6f2-2d6d2208567a'] [ 1871.765870] env[68571]: ERROR nova.compute.manager [instance: 
d890a035-a14e-4be0-97c8-87edd9bb88e4] [ 1871.765870] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] During handling of the above exception, another exception occurred: [ 1871.765870] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] [ 1871.765870] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Traceback (most recent call last): [ 1871.765870] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/compute/manager.py", line 3019, in _cleanup_allocated_networks [ 1871.765870] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] self._deallocate_network(context, instance, requested_networks) [ 1871.765870] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1871.765870] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] self.network_api.deallocate_for_instance( [ 1871.765870] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1871.766281] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] data = neutron.list_ports(**search_opts) [ 1871.766281] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1871.766281] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] ret = obj(*args, **kwargs) [ 1871.766281] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1871.766281] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return self.list('ports', self.ports_path, retrieve_all, [ 1871.766281] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1871.766281] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] ret = obj(*args, **kwargs) [ 1871.766281] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1871.766281] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] for r in self._pagination(collection, path, **params): [ 1871.766281] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1871.766281] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] res = self.get(path, params=params) [ 1871.766281] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1871.766281] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] ret = obj(*args, **kwargs) [ 1871.766707] env[68571]: ERROR nova.compute.manager [instance: 
d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1871.766707] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return self.retry_request("GET", action, body=body, [ 1871.766707] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1871.766707] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] ret = obj(*args, **kwargs) [ 1871.766707] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1871.766707] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return self.do_request(method, action, body=body, [ 1871.766707] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1871.766707] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] ret = obj(*args, **kwargs) [ 1871.766707] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1871.766707] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] self._handle_fault_response(status_code, replybody, resp) [ 1871.766707] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/network/neutron.py", line 204, in wrapper [ 1871.766707] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] raise exception.Unauthorized() [ 1871.766707] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] nova.exception.Unauthorized: Not authorized. 
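The neutron leg of the failure runs through the wrapper at nova/network/neutron.py:196, which proxies every neutronclient call and translates neutronclient.common.exceptions.Unauthorized into a Nova exception: plain nova.exception.Unauthorized when an end user's token is rejected (the raise at line 204 above), or NeutronAdminCredentialConfigurationInvalid when the client was built from the [neutron] service credentials in nova.conf (the raise at line 212, seen in the tracebacks that follow). A minimal sketch of the idea, assuming a plain function wrapper rather than Nova's full client proxy; the names wrap_neutron_errors and from_admin_client are illustrative, not Nova's:

    from neutronclient.common import exceptions as neutron_client_exc

    from nova import exception


    def wrap_neutron_errors(func, from_admin_client=False):
        # Every proxied neutronclient method runs through a wrapper like
        # this, which is why "neutron.py, line 196, in wrapper" brackets
        # each client frame in the tracebacks.
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except neutron_client_exc.Unauthorized:
                if from_admin_client:
                    # A 401 on the service credentials means nova.conf's
                    # [neutron] section is wrong, hence the log line "please
                    # verify Neutron admin credential located in nova.conf".
                    raise exception.NeutronAdminCredentialConfigurationInvalid()
                # Otherwise the caller's own token was rejected.
                raise exception.Unauthorized()
        return wrapper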
[ 1871.767181] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] [ 1871.816489] env[68571]: INFO nova.scheduler.client.report [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Deleted allocations for instance d890a035-a14e-4be0-97c8-87edd9bb88e4 [ 1871.832868] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bea2a767-7544-46c1-8e54-e40580dcead6 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Lock "d890a035-a14e-4be0-97c8-87edd9bb88e4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 620.735s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1871.833130] env[68571]: DEBUG oslo_concurrency.lockutils [None req-4c67d5ab-62d8-4f67-ba30-d4d5d3ffa7d0 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Lock "d890a035-a14e-4be0-97c8-87edd9bb88e4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 424.301s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1871.833348] env[68571]: DEBUG oslo_concurrency.lockutils [None req-4c67d5ab-62d8-4f67-ba30-d4d5d3ffa7d0 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Acquiring lock "d890a035-a14e-4be0-97c8-87edd9bb88e4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1871.833547] env[68571]: DEBUG oslo_concurrency.lockutils [None req-4c67d5ab-62d8-4f67-ba30-d4d5d3ffa7d0 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Lock "d890a035-a14e-4be0-97c8-87edd9bb88e4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1871.833710] env[68571]: DEBUG oslo_concurrency.lockutils [None req-4c67d5ab-62d8-4f67-ba30-d4d5d3ffa7d0 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Lock "d890a035-a14e-4be0-97c8-87edd9bb88e4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1871.835488] env[68571]: INFO nova.compute.manager [None req-4c67d5ab-62d8-4f67-ba30-d4d5d3ffa7d0 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Terminating instance [ 1871.837193] env[68571]: DEBUG nova.compute.manager [None req-4c67d5ab-62d8-4f67-ba30-d4d5d3ffa7d0 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Start destroying the instance on the hypervisor. 
{{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1871.837385] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-4c67d5ab-62d8-4f67-ba30-d4d5d3ffa7d0 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1871.837842] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-90b759b9-a806-4088-adde-34148f908256 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1871.846085] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cc025f56-34dd-43b4-b400-fbdd010bc5f7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1871.875264] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-4c67d5ab-62d8-4f67-ba30-d4d5d3ffa7d0 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance d890a035-a14e-4be0-97c8-87edd9bb88e4 could not be found. [ 1871.875460] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-4c67d5ab-62d8-4f67-ba30-d4d5d3ffa7d0 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1871.875634] env[68571]: INFO nova.compute.manager [None req-4c67d5ab-62d8-4f67-ba30-d4d5d3ffa7d0 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1871.875870] env[68571]: DEBUG oslo.service.loopingcall [None req-4c67d5ab-62d8-4f67-ba30-d4d5d3ffa7d0 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1871.876108] env[68571]: DEBUG nova.compute.manager [-] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1871.876206] env[68571]: DEBUG nova.network.neutron [-] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1871.965688] env[68571]: DEBUG neutronclient.v2_0.client [-] Error message: {"error": {"code": 401, "title": "Unauthorized", "message": "The request you have made requires authentication."}} {{(pid=68571) _handle_fault_response /opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py:262}} [ 1871.965951] env[68571]: ERROR nova.network.neutron [-] Neutron client was not able to generate a valid admin token, please verify Neutron admin credential located in nova.conf: neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1871.966554] env[68571]: ERROR oslo.service.loopingcall [-] Dynamic interval looping call 'oslo_service.loopingcall.RetryDecorator.__call__.<locals>._func' failed: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1871.966554] env[68571]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1871.966554] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1871.966554] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1871.966554] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1871.966554] env[68571]: ERROR oslo.service.loopingcall exception_handler_v20(status_code, error_body) [ 1871.966554] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1871.966554] env[68571]: ERROR oslo.service.loopingcall raise client_exc(message=error_message, [ 1871.966554] env[68571]: ERROR oslo.service.loopingcall neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1871.966554] env[68571]: ERROR oslo.service.loopingcall Neutron server returns request_ids: ['req-f3c94ace-ec77-4736-81f6-a6239fed8556'] [ 1871.966554] env[68571]: ERROR oslo.service.loopingcall [ 1871.966554] env[68571]: ERROR oslo.service.loopingcall During handling of the above exception, another exception occurred: [ 1871.966554] env[68571]: ERROR oslo.service.loopingcall [ 1871.966554] env[68571]: ERROR oslo.service.loopingcall Traceback (most recent call last): [ 1871.966554] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1871.966554] env[68571]: ERROR oslo.service.loopingcall result = func(*self.args, **self.kw) [ 1871.967347] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1871.967347] env[68571]: ERROR oslo.service.loopingcall result = f(*args, 
**kwargs) [ 1871.967347] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1871.967347] env[68571]: ERROR oslo.service.loopingcall self._deallocate_network( [ 1871.967347] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1871.967347] env[68571]: ERROR oslo.service.loopingcall self.network_api.deallocate_for_instance( [ 1871.967347] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1871.967347] env[68571]: ERROR oslo.service.loopingcall data = neutron.list_ports(**search_opts) [ 1871.967347] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1871.967347] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1871.967347] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1871.967347] env[68571]: ERROR oslo.service.loopingcall return self.list('ports', self.ports_path, retrieve_all, [ 1871.967347] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1871.967347] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1871.967347] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1871.967347] env[68571]: ERROR oslo.service.loopingcall for r in self._pagination(collection, path, **params): [ 1871.967347] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1871.967347] env[68571]: ERROR oslo.service.loopingcall res = self.get(path, params=params) [ 1871.968360] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1871.968360] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1871.968360] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1871.968360] env[68571]: ERROR oslo.service.loopingcall return self.retry_request("GET", action, body=body, [ 1871.968360] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1871.968360] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1871.968360] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1871.968360] env[68571]: ERROR oslo.service.loopingcall return self.do_request(method, action, body=body, [ 1871.968360] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1871.968360] env[68571]: ERROR oslo.service.loopingcall ret = obj(*args, **kwargs) [ 1871.968360] env[68571]: ERROR oslo.service.loopingcall File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1871.968360] env[68571]: ERROR oslo.service.loopingcall self._handle_fault_response(status_code, replybody, resp) [ 1871.968360] env[68571]: ERROR oslo.service.loopingcall File 
"/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1871.968360] env[68571]: ERROR oslo.service.loopingcall raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1871.968360] env[68571]: ERROR oslo.service.loopingcall nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1871.968360] env[68571]: ERROR oslo.service.loopingcall [ 1871.968969] env[68571]: ERROR nova.compute.manager [None req-4c67d5ab-62d8-4f67-ba30-d4d5d3ffa7d0 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Failed to deallocate network for instance. Error: Networking client is experiencing an unauthorized exception.: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1871.996232] env[68571]: ERROR nova.compute.manager [None req-4c67d5ab-62d8-4f67-ba30-d4d5d3ffa7d0 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Setting instance vm_state to ERROR: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1871.996232] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Traceback (most recent call last): [ 1871.996232] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1871.996232] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] ret = obj(*args, **kwargs) [ 1871.996232] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1871.996232] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] exception_handler_v20(status_code, error_body) [ 1871.996232] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1871.996232] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] raise client_exc(message=error_message, [ 1871.996232] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1871.996232] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Neutron server returns request_ids: ['req-f3c94ace-ec77-4736-81f6-a6239fed8556'] [ 1871.996232] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] [ 1871.996755] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] During handling of the above exception, another exception occurred: [ 1871.996755] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] [ 1871.996755] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Traceback (most recent call last): [ 1871.996755] env[68571]: ERROR nova.compute.manager [instance: 
d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1871.996755] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] self._delete_instance(context, instance, bdms) [ 1871.996755] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1871.996755] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] self._shutdown_instance(context, instance, bdms) [ 1871.996755] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1871.996755] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] self._try_deallocate_network(context, instance, requested_networks) [ 1871.996755] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1871.996755] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] with excutils.save_and_reraise_exception(): [ 1871.996755] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1871.996755] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] self.force_reraise() [ 1871.997217] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1871.997217] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] raise self.value [ 1871.997217] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1871.997217] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] _deallocate_network_with_retries() [ 1871.997217] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1871.997217] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return evt.wait() [ 1871.997217] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1871.997217] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] result = hub.switch() [ 1871.997217] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1871.997217] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return self.greenlet.switch() [ 1871.997217] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1871.997217] env[68571]: ERROR nova.compute.manager [instance: 
d890a035-a14e-4be0-97c8-87edd9bb88e4] result = func(*self.args, **self.kw) [ 1871.997634] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1871.997634] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] result = f(*args, **kwargs) [ 1871.997634] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1871.997634] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] self._deallocate_network( [ 1871.997634] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1871.997634] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] self.network_api.deallocate_for_instance( [ 1871.997634] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1871.997634] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] data = neutron.list_ports(**search_opts) [ 1871.997634] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1871.997634] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] ret = obj(*args, **kwargs) [ 1871.997634] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1871.997634] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return self.list('ports', self.ports_path, retrieve_all, [ 1871.997634] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1871.998152] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] ret = obj(*args, **kwargs) [ 1871.998152] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1871.998152] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] for r in self._pagination(collection, path, **params): [ 1871.998152] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1871.998152] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] res = self.get(path, params=params) [ 1871.998152] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1871.998152] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] ret = obj(*args, **kwargs) [ 1871.998152] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1871.998152] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return self.retry_request("GET", action, body=body, [ 1871.998152] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1871.998152] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] ret = obj(*args, **kwargs) [ 1871.998152] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1871.998152] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] return self.do_request(method, action, body=body, [ 1871.998570] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1871.998570] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] ret = obj(*args, **kwargs) [ 1871.998570] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1871.998570] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] self._handle_fault_response(status_code, replybody, resp) [ 1871.998570] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1871.998570] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1871.998570] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1871.998570] env[68571]: ERROR nova.compute.manager [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] [ 1872.026195] env[68571]: DEBUG oslo_concurrency.lockutils [None req-4c67d5ab-62d8-4f67-ba30-d4d5d3ffa7d0 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Lock "d890a035-a14e-4be0-97c8-87edd9bb88e4" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.193s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1872.027271] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "d890a035-a14e-4be0-97c8-87edd9bb88e4" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 150.531s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1872.027448] env[68571]: INFO nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 1872.027619] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "d890a035-a14e-4be0-97c8-87edd9bb88e4" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1872.074247] env[68571]: INFO nova.compute.manager [None req-4c67d5ab-62d8-4f67-ba30-d4d5d3ffa7d0 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] [instance: d890a035-a14e-4be0-97c8-87edd9bb88e4] Successfully reverted task state from None on failure for instance. [ 1872.077590] env[68571]: ERROR oslo_messaging.rpc.server [None req-4c67d5ab-62d8-4f67-ba30-d4d5d3ffa7d0 tempest-ListImageFiltersTestJSON-1592965173 tempest-ListImageFiltersTestJSON-1592965173-project-member] Exception during message handling: nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1872.077590] env[68571]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1872.077590] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1872.077590] env[68571]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1872.077590] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 272, in _handle_fault_response [ 1872.077590] env[68571]: ERROR oslo_messaging.rpc.server exception_handler_v20(status_code, error_body) [ 1872.077590] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 90, in exception_handler_v20 [ 1872.077590] env[68571]: ERROR oslo_messaging.rpc.server raise client_exc(message=error_message, [ 1872.077590] env[68571]: ERROR oslo_messaging.rpc.server neutronclient.common.exceptions.Unauthorized: 401-{'error': {'code': 401, 'title': 'Unauthorized', 'message': 'The request you have made requires authentication.'}} [ 1872.077590] env[68571]: ERROR oslo_messaging.rpc.server Neutron server returns request_ids: ['req-f3c94ace-ec77-4736-81f6-a6239fed8556'] [ 1872.077590] env[68571]: ERROR oslo_messaging.rpc.server [ 1872.077590] env[68571]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred: [ 1872.077590] env[68571]: ERROR oslo_messaging.rpc.server [ 1872.077590] env[68571]: ERROR oslo_messaging.rpc.server Traceback (most recent call last): [ 1872.077590] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming [ 1872.078228] env[68571]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message) [ 1872.078228] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch [ 1872.078228] env[68571]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args) [ 1872.078228] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch [ 1872.078228] env[68571]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args) [ 1872.078228] env[68571]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped [ 1872.078228] env[68571]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1872.078228] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1872.078228] env[68571]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1872.078228] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1872.078228] env[68571]: ERROR oslo_messaging.rpc.server raise self.value [ 1872.078228] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped [ 1872.078228] env[68571]: ERROR oslo_messaging.rpc.server return f(self, context, *args, **kw) [ 1872.078228] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function [ 1872.078228] env[68571]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1872.078228] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1872.078228] env[68571]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1872.078228] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1872.078833] env[68571]: ERROR oslo_messaging.rpc.server raise self.value [ 1872.078833] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function [ 1872.078833] env[68571]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1872.078833] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function [ 1872.078833] env[68571]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1872.078833] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function [ 1872.078833] env[68571]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1872.078833] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1872.078833] env[68571]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1872.078833] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1872.078833] env[68571]: ERROR oslo_messaging.rpc.server raise self.value [ 1872.078833] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function [ 1872.078833] env[68571]: ERROR oslo_messaging.rpc.server return function(self, context, *args, **kwargs) [ 1872.078833] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3327, in terminate_instance [ 1872.078833] env[68571]: ERROR oslo_messaging.rpc.server do_terminate_instance(instance, bdms) [ 1872.078833] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py", line 414, in inner [ 1872.078833] env[68571]: ERROR oslo_messaging.rpc.server return f(*args, 
**kwargs) [ 1872.078833] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3322, in do_terminate_instance [ 1872.079419] env[68571]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1872.079419] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1872.079419] env[68571]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1872.079419] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1872.079419] env[68571]: ERROR oslo_messaging.rpc.server raise self.value [ 1872.079419] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3315, in do_terminate_instance [ 1872.079419] env[68571]: ERROR oslo_messaging.rpc.server self._delete_instance(context, instance, bdms) [ 1872.079419] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3250, in _delete_instance [ 1872.079419] env[68571]: ERROR oslo_messaging.rpc.server self._shutdown_instance(context, instance, bdms) [ 1872.079419] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3144, in _shutdown_instance [ 1872.079419] env[68571]: ERROR oslo_messaging.rpc.server self._try_deallocate_network(context, instance, requested_networks) [ 1872.079419] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3058, in _try_deallocate_network [ 1872.079419] env[68571]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception(): [ 1872.079419] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__ [ 1872.079419] env[68571]: ERROR oslo_messaging.rpc.server self.force_reraise() [ 1872.079419] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise [ 1872.079419] env[68571]: ERROR oslo_messaging.rpc.server raise self.value [ 1872.079419] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3056, in _try_deallocate_network [ 1872.079996] env[68571]: ERROR oslo_messaging.rpc.server _deallocate_network_with_retries() [ 1872.079996] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 436, in func [ 1872.079996] env[68571]: ERROR oslo_messaging.rpc.server return evt.wait() [ 1872.079996] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1872.079996] env[68571]: ERROR oslo_messaging.rpc.server result = hub.switch() [ 1872.079996] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1872.079996] env[68571]: ERROR oslo_messaging.rpc.server return self.greenlet.switch() [ 1872.079996] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 150, in _run_loop [ 1872.079996] env[68571]: ERROR oslo_messaging.rpc.server result = func(*self.args, **self.kw) [ 1872.079996] env[68571]: ERROR oslo_messaging.rpc.server File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py", line 407, in _func [ 1872.079996] env[68571]: ERROR oslo_messaging.rpc.server result = f(*args, **kwargs) [ 1872.079996] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 3045, in _deallocate_network_with_retries [ 1872.079996] env[68571]: ERROR oslo_messaging.rpc.server self._deallocate_network( [ 1872.079996] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/compute/manager.py", line 2265, in _deallocate_network [ 1872.079996] env[68571]: ERROR oslo_messaging.rpc.server self.network_api.deallocate_for_instance( [ 1872.079996] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 1806, in deallocate_for_instance [ 1872.079996] env[68571]: ERROR oslo_messaging.rpc.server data = neutron.list_ports(**search_opts) [ 1872.079996] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1872.080559] env[68571]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1872.080559] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 815, in list_ports [ 1872.080559] env[68571]: ERROR oslo_messaging.rpc.server return self.list('ports', self.ports_path, retrieve_all, [ 1872.080559] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1872.080559] env[68571]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1872.080559] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 372, in list [ 1872.080559] env[68571]: ERROR oslo_messaging.rpc.server for r in self._pagination(collection, path, **params): [ 1872.080559] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 387, in _pagination [ 1872.080559] env[68571]: ERROR oslo_messaging.rpc.server res = self.get(path, params=params) [ 1872.080559] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1872.080559] env[68571]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1872.080559] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 356, in get [ 1872.080559] env[68571]: ERROR oslo_messaging.rpc.server return self.retry_request("GET", action, body=body, [ 1872.080559] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1872.080559] env[68571]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1872.080559] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 333, in retry_request [ 1872.080559] env[68571]: ERROR oslo_messaging.rpc.server return self.do_request(method, action, body=body, [ 1872.080559] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 196, in wrapper [ 1872.081175] env[68571]: ERROR oslo_messaging.rpc.server ret = obj(*args, **kwargs) [ 1872.081175] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/neutronclient/v2_0/client.py", line 297, in do_request [ 1872.081175] 
env[68571]: ERROR oslo_messaging.rpc.server self._handle_fault_response(status_code, replybody, resp) [ 1872.081175] env[68571]: ERROR oslo_messaging.rpc.server File "/opt/stack/nova/nova/network/neutron.py", line 212, in wrapper [ 1872.081175] env[68571]: ERROR oslo_messaging.rpc.server raise exception.NeutronAdminCredentialConfigurationInvalid() [ 1872.081175] env[68571]: ERROR oslo_messaging.rpc.server nova.exception.NeutronAdminCredentialConfigurationInvalid: Networking client is experiencing an unauthorized exception. [ 1872.081175] env[68571]: ERROR oslo_messaging.rpc.server [ 1905.489017] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1905.500256] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1905.500491] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1905.500656] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1905.500812] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1905.501983] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-50717331-4de3-444a-b895-9bd21bb4e793 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1905.510991] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-584e8e12-e225-42f0-9ccf-477724ab6fba {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1905.528207] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b014548e-6481-4bb8-810a-9ca85f470819 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1905.534893] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f2aada0b-c78e-405f-87e5-f777500f21f3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1905.564925] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 
free_ram=180926MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1905.565078] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1905.565270] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1905.632584] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 9e8c8d14-144f-42e3-8556-796651b7b04f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1905.632744] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 1f8dd053-ebd8-4ad9-a607-ab364a3320ca actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1905.632873] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1905.633008] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 8506e00f-2b77-4fa1-804a-8e548b78ee7d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1905.633163] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 7fd03349-420c-4076-959c-31562e95098d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1905.633277] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 5deee3f1-70a0-4c0d-bda6-365235ca0d78 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1905.633382] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1905.633498] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 62ce83ad-bb1b-4f78-8d0b-9b516290bac6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1905.633683] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1905.633819] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1905.724333] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-49dc506e-de4c-427f-a8fd-6a422c91e0cd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1905.731991] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a9a40a3b-a756-49aa-9f06-a9a63ae41c42 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1905.761261] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd5a801b-6880-48b7-8da7-ffc49c57dd14 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1905.767671] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28474296-8dce-4e4e-baed-2f6e32556cd2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1905.780185] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1905.788855] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} 
{{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1905.800560] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1905.800739] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.235s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1906.801412] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1907.490393] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1909.489820] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1911.485240] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1912.489401] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1914.489616] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1914.489987] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1914.489987] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1914.507755] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Skipping network cache update for instance because it is Building. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1914.507957] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1914.508143] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1914.508305] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1914.508461] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 7fd03349-420c-4076-959c-31562e95098d] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1914.508615] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1914.508764] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1914.508920] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1914.509054] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1916.489863] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1916.829617] env[68571]: WARNING oslo_vmware.rw_handles [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1916.829617] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1916.829617] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1916.829617] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1916.829617] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1916.829617] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 1916.829617] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1916.829617] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1916.829617] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1916.829617] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1916.829617] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1916.829617] env[68571]: ERROR oslo_vmware.rw_handles [ 1916.830127] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/78f42d77-be26-4335-b064-1bd98b51ec78/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1916.832546] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1916.832792] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Copying Virtual Disk [datastore1] vmware_temp/78f42d77-be26-4335-b064-1bd98b51ec78/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/78f42d77-be26-4335-b064-1bd98b51ec78/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1916.833098] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-cd1d9572-7ef4-4e1e-a7a5-5e2888ac2951 {{(pid=68571) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1916.841679] env[68571]: DEBUG oslo_vmware.api [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Waiting for the task: (returnval){ [ 1916.841679] env[68571]: value = "task-3467760" [ 1916.841679] env[68571]: _type = "Task" [ 1916.841679] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1916.849991] env[68571]: DEBUG oslo_vmware.api [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Task: {'id': task-3467760, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1917.351832] env[68571]: DEBUG oslo_vmware.exceptions [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Fault InvalidArgument not matched. {{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1917.352118] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1917.352689] env[68571]: ERROR nova.compute.manager [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1917.352689] env[68571]: Faults: ['InvalidArgument'] [ 1917.352689] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Traceback (most recent call last): [ 1917.352689] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1917.352689] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] yield resources [ 1917.352689] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1917.352689] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] self.driver.spawn(context, instance, image_meta, [ 1917.352689] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1917.352689] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1917.352689] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1917.352689] env[68571]: ERROR nova.compute.manager [instance: 
9e8c8d14-144f-42e3-8556-796651b7b04f] self._fetch_image_if_missing(context, vi) [ 1917.352689] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1917.353339] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] image_cache(vi, tmp_image_ds_loc) [ 1917.353339] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1917.353339] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] vm_util.copy_virtual_disk( [ 1917.353339] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1917.353339] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] session._wait_for_task(vmdk_copy_task) [ 1917.353339] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1917.353339] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] return self.wait_for_task(task_ref) [ 1917.353339] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1917.353339] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] return evt.wait() [ 1917.353339] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1917.353339] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] result = hub.switch() [ 1917.353339] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1917.353339] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] return self.greenlet.switch() [ 1917.353723] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1917.353723] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] self.f(*self.args, **self.kw) [ 1917.353723] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1917.353723] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] raise exceptions.translate_fault(task_info.error) [ 1917.353723] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1917.353723] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Faults: ['InvalidArgument'] [ 1917.353723] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] [ 1917.353723] env[68571]: INFO nova.compute.manager 
[None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Terminating instance [ 1917.354548] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1917.354762] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1917.355011] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-80f0df5a-cfe9-45c1-a286-23041558d2e4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1917.357370] env[68571]: DEBUG nova.compute.manager [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1917.357560] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1917.358278] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bd9c3cc7-f734-40ac-bb47-8a395fc048be {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1917.364756] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1917.364961] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-9ffe7c11-ede2-4f9f-a563-0a5ff40db512 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1917.366998] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1917.367184] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1917.368115] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-410e35e4-3496-4ffe-af2c-b92ca8270d3f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1917.372833] env[68571]: DEBUG oslo_vmware.api [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Waiting for the task: (returnval){ [ 1917.372833] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]529e5824-e9c3-341c-d4aa-9716c1e13870" [ 1917.372833] env[68571]: _type = "Task" [ 1917.372833] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1917.386210] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1917.386427] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Creating directory with path [datastore1] vmware_temp/0d5d51b9-7dae-43cb-a091-c8cd5689cf87/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1917.386631] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5b7dbf58-0566-4b5d-a747-adc4d6da4c40 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1917.406603] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Created directory with path [datastore1] vmware_temp/0d5d51b9-7dae-43cb-a091-c8cd5689cf87/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1917.406787] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Fetch image to [datastore1] vmware_temp/0d5d51b9-7dae-43cb-a091-c8cd5689cf87/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1917.406953] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/0d5d51b9-7dae-43cb-a091-c8cd5689cf87/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1917.407667] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a1e40725-fc04-4196-9605-bf01955270c1 {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1917.414337] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1cc6dc69-82b1-41a2-b9fc-e3599d73f74d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1917.423246] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6e46a2e2-abe6-4765-aeec-a519f9d5d064 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1917.455571] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6f76aa94-38c9-466b-a204-dd23616ea5a8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1917.457965] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1917.458172] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1917.458347] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Deleting the datastore file [datastore1] 9e8c8d14-144f-42e3-8556-796651b7b04f {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1917.458562] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-0404fc9e-7de5-4e0a-b9fd-7e68ca0a1d0e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1917.465025] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-91e336c2-4fed-473b-b781-d5803a4be5bc {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1917.466812] env[68571]: DEBUG oslo_vmware.api [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Waiting for the task: (returnval){ [ 1917.466812] env[68571]: value = "task-3467762" [ 1917.466812] env[68571]: _type = "Task" [ 1917.466812] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1917.487139] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1917.537147] env[68571]: DEBUG oslo_vmware.rw_handles [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0d5d51b9-7dae-43cb-a091-c8cd5689cf87/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1917.600267] env[68571]: DEBUG oslo_vmware.rw_handles [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1917.600542] env[68571]: DEBUG oslo_vmware.rw_handles [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/0d5d51b9-7dae-43cb-a091-c8cd5689cf87/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1917.977362] env[68571]: DEBUG oslo_vmware.api [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Task: {'id': task-3467762, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077028} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1917.977624] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1917.977771] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1917.977942] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1917.978129] env[68571]: INFO nova.compute.manager [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Took 0.62 seconds to destroy the instance on the hypervisor. [ 1917.980344] env[68571]: DEBUG nova.compute.claims [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1917.980522] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1917.980722] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1918.110588] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c13a7da0-42b3-4d4b-a69c-1a0e6d4ebb3c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1918.117901] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e5b79d8c-0c44-4721-ba93-29f8c1e146e4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1918.146632] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9af153fa-73e1-414d-b970-8713a4ed7e76 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1918.154446] env[68571]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f7e37c35-c2d8-4f5f-adf0-7a73af1e5bf5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1918.167707] env[68571]: DEBUG nova.compute.provider_tree [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1918.175973] env[68571]: DEBUG nova.scheduler.client.report [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1918.191054] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.210s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1918.191585] env[68571]: ERROR nova.compute.manager [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1918.191585] env[68571]: Faults: ['InvalidArgument'] [ 1918.191585] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Traceback (most recent call last): [ 1918.191585] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1918.191585] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] self.driver.spawn(context, instance, image_meta, [ 1918.191585] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1918.191585] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1918.191585] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1918.191585] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] self._fetch_image_if_missing(context, vi) [ 1918.191585] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 
1918.191585] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] image_cache(vi, tmp_image_ds_loc) [ 1918.191585] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1918.191973] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] vm_util.copy_virtual_disk( [ 1918.191973] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1918.191973] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] session._wait_for_task(vmdk_copy_task) [ 1918.191973] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1918.191973] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] return self.wait_for_task(task_ref) [ 1918.191973] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1918.191973] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] return evt.wait() [ 1918.191973] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1918.191973] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] result = hub.switch() [ 1918.191973] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1918.191973] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] return self.greenlet.switch() [ 1918.191973] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1918.191973] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] self.f(*self.args, **self.kw) [ 1918.192364] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1918.192364] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] raise exceptions.translate_fault(task_info.error) [ 1918.192364] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1918.192364] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Faults: ['InvalidArgument'] [ 1918.192364] env[68571]: ERROR nova.compute.manager [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] [ 1918.192364] env[68571]: DEBUG nova.compute.utils [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] VimFaultException {{(pid=68571) notify_about_instance_usage 
/opt/stack/nova/nova/compute/utils.py:430}} [ 1918.193672] env[68571]: DEBUG nova.compute.manager [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Build of instance 9e8c8d14-144f-42e3-8556-796651b7b04f was re-scheduled: A specified parameter was not correct: fileType [ 1918.193672] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1918.194073] env[68571]: DEBUG nova.compute.manager [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1918.194262] env[68571]: DEBUG nova.compute.manager [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1918.194439] env[68571]: DEBUG nova.compute.manager [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1918.194600] env[68571]: DEBUG nova.network.neutron [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1918.489769] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1918.489769] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1918.496876] env[68571]: DEBUG nova.network.neutron [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1918.509079] env[68571]: INFO nova.compute.manager [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Took 0.31 seconds to deallocate network for instance.
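
The two tracebacks above both end in the same place: oslo.vmware's task wait loop polls the CopyVirtualDisk_Task until vCenter reports an error state, then raises the translated fault (here a VimFaultException with Faults: ['InvalidArgument']). A minimal, self-contained sketch of that poll-and-translate pattern follows; get_task_info and TaskFault are hypothetical stand-ins for the vSphere round-trip and the oslo.vmware exception, not the library's own API.

import time


class TaskFault(Exception):
    """Hypothetical stand-in for oslo_vmware.exceptions.VimFaultException."""

    def __init__(self, message, faults):
        super().__init__(message)
        self.faults = faults


def wait_for_task(get_task_info, interval=0.5):
    """Poll a task until it leaves the running states, as in the log:
    progress messages while 'running', a result on 'success', and a
    translated fault on 'error' (e.g. Faults: ['InvalidArgument'])."""
    while True:
        info = get_task_info()  # stands in for one PropertyCollector round-trip
        if info["state"] == "success":
            return info.get("result")
        if info["state"] == "error":
            # Mirrors "raise exceptions.translate_fault(task_info.error)"
            # from the traceback, but with the hypothetical TaskFault.
            raise TaskFault(info["error"], info.get("faults", []))
        # 'queued' or 'running': report progress and poll again.
        print(f"Task {info['id']} progress is {info.get('progress', 0)}%.")
        time.sleep(interval)


if __name__ == "__main__":
    # Replay the states seen above: 0% progress, then the fileType fault.
    states = iter([
        {"id": "task-3467760", "state": "running", "progress": 0},
        {"id": "task-3467760", "state": "error",
         "error": "A specified parameter was not correct: fileType",
         "faults": ["InvalidArgument"]},
    ])
    try:
        wait_for_task(lambda: next(states), interval=0)
    except TaskFault as exc:
        print(f"Instance failed to spawn: {exc} Faults: {exc.faults}")
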
[ 1918.609675] env[68571]: INFO nova.scheduler.client.report [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Deleted allocations for instance 9e8c8d14-144f-42e3-8556-796651b7b04f [ 1918.632190] env[68571]: DEBUG oslo_concurrency.lockutils [None req-1dadec66-da07-45bc-a6ab-1c39d07dd3ae tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "9e8c8d14-144f-42e3-8556-796651b7b04f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 561.683s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1918.632412] env[68571]: DEBUG oslo_concurrency.lockutils [None req-6abd1a94-7c8e-4161-9109-7d166f9dd86a tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "9e8c8d14-144f-42e3-8556-796651b7b04f" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 366.365s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1918.632888] env[68571]: DEBUG oslo_concurrency.lockutils [None req-6abd1a94-7c8e-4161-9109-7d166f9dd86a tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "9e8c8d14-144f-42e3-8556-796651b7b04f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1918.632888] env[68571]: DEBUG oslo_concurrency.lockutils [None req-6abd1a94-7c8e-4161-9109-7d166f9dd86a tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "9e8c8d14-144f-42e3-8556-796651b7b04f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1918.633056] env[68571]: DEBUG oslo_concurrency.lockutils [None req-6abd1a94-7c8e-4161-9109-7d166f9dd86a tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "9e8c8d14-144f-42e3-8556-796651b7b04f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1918.634904] env[68571]: INFO nova.compute.manager [None req-6abd1a94-7c8e-4161-9109-7d166f9dd86a tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Terminating instance [ 1918.636740] env[68571]: DEBUG nova.compute.manager [None req-6abd1a94-7c8e-4161-9109-7d166f9dd86a tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Start destroying the instance on the hypervisor. 
{{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1918.636987] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-6abd1a94-7c8e-4161-9109-7d166f9dd86a tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1918.637394] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-f3b6a81b-8efa-4e23-a4a1-6f0905cad32a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1918.648020] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-befc189b-1c54-465d-93f8-563f593544e2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1918.672874] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-6abd1a94-7c8e-4161-9109-7d166f9dd86a tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 9e8c8d14-144f-42e3-8556-796651b7b04f could not be found. [ 1918.673084] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-6abd1a94-7c8e-4161-9109-7d166f9dd86a tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1918.673269] env[68571]: INFO nova.compute.manager [None req-6abd1a94-7c8e-4161-9109-7d166f9dd86a tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1918.673503] env[68571]: DEBUG oslo.service.loopingcall [None req-6abd1a94-7c8e-4161-9109-7d166f9dd86a tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1918.673975] env[68571]: DEBUG nova.compute.manager [-] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1918.674101] env[68571]: DEBUG nova.network.neutron [-] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1918.696700] env[68571]: DEBUG nova.network.neutron [-] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1918.705204] env[68571]: INFO nova.compute.manager [-] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] Took 0.03 seconds to deallocate network for instance. 
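
The terminate path above is deliberately idempotent: the rescheduled build already cleaned up the VM, so the later destroy gets InstanceNotFound from the backend, logs a warning, and still proceeds to network deallocation. A minimal sketch of that "already gone counts as destroyed" pattern, with hypothetical Hypervisor/lookup_vm helpers rather than the Nova/vmwareapi code paths named in the log:

class InstanceNotFound(Exception):
    pass


class Hypervisor:
    """Hypothetical backend holding the set of VMs it knows about."""

    def __init__(self, vms):
        self._vms = set(vms)

    def lookup_vm(self, uuid):
        if uuid not in self._vms:
            raise InstanceNotFound(f"Instance {uuid} could not be found.")
        return uuid

    def unregister_and_delete(self, vm_ref):
        self._vms.discard(vm_ref)


def destroy_instance(hv, uuid):
    """Destroy the VM if it exists; treat 'already gone' as success so the
    caller can continue with network/allocation cleanup either way."""
    try:
        vm_ref = hv.lookup_vm(uuid)
    except InstanceNotFound as exc:
        print(f"WARNING Instance does not exist on backend: {exc}")
    else:
        hv.unregister_and_delete(vm_ref)
    print(f"[instance: {uuid}] Instance destroyed")


if __name__ == "__main__":
    hv = Hypervisor(vms=[])  # backend already empty, as in the log
    destroy_instance(hv, "9e8c8d14-144f-42e3-8556-796651b7b04f")
    # Network deallocation would follow here regardless of the warning.
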
[ 1918.817301] env[68571]: DEBUG oslo_concurrency.lockutils [None req-6abd1a94-7c8e-4161-9109-7d166f9dd86a tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "9e8c8d14-144f-42e3-8556-796651b7b04f" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.185s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1918.818136] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "9e8c8d14-144f-42e3-8556-796651b7b04f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 197.322s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1918.818257] env[68571]: INFO nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 9e8c8d14-144f-42e3-8556-796651b7b04f] During sync_power_state the instance has a pending task (deleting). Skip. [ 1918.818427] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "9e8c8d14-144f-42e3-8556-796651b7b04f" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1944.418265] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f229748c-3a80-4471-abc5-10783d125053 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "62ce83ad-bb1b-4f78-8d0b-9b516290bac6" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1944.602232] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Acquiring lock "ad3a9183-0e9e-44df-b920-b8b8360a65e5" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1944.602484] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Lock "ad3a9183-0e9e-44df-b920-b8b8360a65e5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1944.611359] env[68571]: DEBUG nova.compute.manager [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Starting instance... 
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1944.658081] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1944.658328] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1944.659798] env[68571]: INFO nova.compute.claims [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1944.795018] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-696150c5-c658-4d50-8cc5-cba7f93467b7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1944.800827] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-615af01f-1423-47e2-ad3e-acd851cfd4f7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1944.830115] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-132ccb32-0382-4690-ae72-ba68fb25fc8c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1944.837032] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-76303fd8-b45e-4c59-baf8-722116e2acae {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1944.849541] env[68571]: DEBUG nova.compute.provider_tree [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1944.860648] env[68571]: DEBUG nova.scheduler.client.report [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1944.873340] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bc174b91-e67a-477e-bd0e-b774df00d226 
tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.215s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1944.873788] env[68571]: DEBUG nova.compute.manager [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1944.907413] env[68571]: DEBUG nova.compute.utils [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1944.908542] env[68571]: DEBUG nova.compute.manager [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Not allocating networking since 'none' was specified. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1944.916840] env[68571]: DEBUG nova.compute.manager [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1944.975384] env[68571]: DEBUG nova.compute.manager [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1944.999896] env[68571]: DEBUG nova.virt.hardware [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1945.000138] env[68571]: DEBUG nova.virt.hardware [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1945.000296] env[68571]: DEBUG nova.virt.hardware [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1945.000477] env[68571]: DEBUG nova.virt.hardware [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1945.000621] env[68571]: DEBUG nova.virt.hardware [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1945.000764] env[68571]: DEBUG nova.virt.hardware [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1945.000966] env[68571]: DEBUG nova.virt.hardware [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1945.001137] env[68571]: DEBUG nova.virt.hardware [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1945.001351] env[68571]: DEBUG nova.virt.hardware [None req-bc174b91-e67a-477e-bd0e-b774df00d226 
tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1945.001517] env[68571]: DEBUG nova.virt.hardware [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1945.001685] env[68571]: DEBUG nova.virt.hardware [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1945.002560] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-643297b4-3c1f-402d-9029-abd635af2e9c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.010028] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8b5e8a84-8030-4df4-8a7c-c46fc7f510f2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.023042] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Instance VIF info [] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1945.028453] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Creating folder: Project (77ea2ca3389049ab94cf2dae1002d4bc). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1945.028701] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-e6430a4b-c26a-42b8-8afc-d66b96aa4d34 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.037543] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Created folder: Project (77ea2ca3389049ab94cf2dae1002d4bc) in parent group-v692787. [ 1945.037720] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Creating folder: Instances. Parent ref: group-v692891. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1945.037919] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-f48e1f46-5e98-4e40-89e1-218f822f2d18 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.045032] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Created folder: Instances in parent group-v692891. 
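The Folder.CreateFolder invocations above are Nova's vm_util building the per-project and per-instance folder hierarchy through the shared oslo.vmware session. A minimal sketch of that call pattern follows, assuming a placeholder vCenter host/credentials and simplified error handling; it is illustrative, not the exact nova.virt.vmwareapi.vm_util code:

    from oslo_vmware import api as vmware_api
    from oslo_vmware import exceptions as vexc

    # Assumption: placeholder host and credentials; the real session is built
    # from nova.conf [vmware] options when the VMwareVCDriver loads.
    session = vmware_api.VMwareAPISession(
        'vc1.example.test', 'user', 'secret',
        api_retry_count=10, task_poll_interval=0.5)

    def create_folder(session, parent_ref, name):
        """Create a vSphere folder under parent_ref, tolerating duplicates."""
        try:
            # Folder.CreateFolder is a plain (non-task) vSphere call issued
            # through the SOAP client that the session wraps.
            return session.invoke_api(session.vim, 'CreateFolder',
                                      parent_ref, name=name)
        except vexc.DuplicateName:
            # A concurrent build created the folder first; Nova treats this
            # as success, consistent with the "Created folder" INFO lines above.
            return None

Plain calls like CreateFolder return a result directly; task-returning methods such as Folder.CreateVM_Task (invoked just below) instead hand back a task reference that wait_for_task polls until the task reports success or error.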
[ 1945.045252] env[68571]: DEBUG oslo.service.loopingcall [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1945.045427] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1945.045610] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-0e0e3299-45be-47aa-af30-26a0928e7ff6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.060684] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1945.060684] env[68571]: value = "task-3467765" [ 1945.060684] env[68571]: _type = "Task" [ 1945.060684] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1945.067473] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467765, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1945.570210] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467765, 'name': CreateVM_Task, 'duration_secs': 0.246345} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1945.570514] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1945.570795] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1945.570958] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1945.571309] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1945.571580] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-eedd80fb-8ab8-40d2-9d81-980bcdf890fb {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1945.575797] env[68571]: DEBUG oslo_vmware.api [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Waiting for the task: (returnval){ 
[ 1945.575797] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52f4fec3-6346-bde4-61e7-0de0e63984de" [ 1945.575797] env[68571]: _type = "Task" [ 1945.575797] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1945.583502] env[68571]: DEBUG oslo_vmware.api [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52f4fec3-6346-bde4-61e7-0de0e63984de, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1946.086508] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1946.086749] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1946.087032] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1963.594540] env[68571]: WARNING oslo_vmware.rw_handles [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 1963.594540] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 1963.594540] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 1963.594540] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 1963.594540] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 1963.594540] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 1963.594540] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 1963.594540] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 1963.594540] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 1963.594540] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 1963.594540] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 1963.594540] env[68571]: ERROR oslo_vmware.rw_handles [ 1963.594540] env[68571]: DEBUG nova.virt.vmwareapi.images [None 
req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/0d5d51b9-7dae-43cb-a091-c8cd5689cf87/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 1963.600428] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 1963.600428] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Copying Virtual Disk [datastore1] vmware_temp/0d5d51b9-7dae-43cb-a091-c8cd5689cf87/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/0d5d51b9-7dae-43cb-a091-c8cd5689cf87/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 1963.600428] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-23f9dfa0-0fa4-4bf1-9f30-ee3599942486 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1963.610018] env[68571]: DEBUG oslo_vmware.api [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Waiting for the task: (returnval){ [ 1963.610018] env[68571]: value = "task-3467766" [ 1963.610018] env[68571]: _type = "Task" [ 1963.610018] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1963.619035] env[68571]: DEBUG oslo_vmware.api [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Task: {'id': task-3467766, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1964.118473] env[68571]: DEBUG oslo_vmware.exceptions [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Fault InvalidArgument not matched. 
{{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 1964.118767] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1964.119449] env[68571]: ERROR nova.compute.manager [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1964.119449] env[68571]: Faults: ['InvalidArgument'] [ 1964.119449] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Traceback (most recent call last): [ 1964.119449] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 1964.119449] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] yield resources [ 1964.119449] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1964.119449] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] self.driver.spawn(context, instance, image_meta, [ 1964.119449] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1964.119449] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1964.119449] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1964.119449] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] self._fetch_image_if_missing(context, vi) [ 1964.119449] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1964.119853] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] image_cache(vi, tmp_image_ds_loc) [ 1964.119853] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1964.119853] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] vm_util.copy_virtual_disk( [ 1964.119853] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1964.119853] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] session._wait_for_task(vmdk_copy_task) [ 1964.119853] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", 
line 157, in _wait_for_task [ 1964.119853] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] return self.wait_for_task(task_ref) [ 1964.119853] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1964.119853] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] return evt.wait() [ 1964.119853] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1964.119853] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] result = hub.switch() [ 1964.119853] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1964.119853] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] return self.greenlet.switch() [ 1964.120313] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1964.120313] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] self.f(*self.args, **self.kw) [ 1964.120313] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1964.120313] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] raise exceptions.translate_fault(task_info.error) [ 1964.120313] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1964.120313] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Faults: ['InvalidArgument'] [ 1964.120313] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] [ 1964.120313] env[68571]: INFO nova.compute.manager [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Terminating instance [ 1964.121413] env[68571]: DEBUG oslo_concurrency.lockutils [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1964.121665] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1964.121905] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-a7c6b76e-393d-467b-a2d3-ca72e4885cd6 {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1964.124158] env[68571]: DEBUG nova.compute.manager [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1964.124359] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1964.125090] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-28066a62-cd1f-45ee-8564-bd6511b4c5fc {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1964.133527] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 1964.133815] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-b092c58a-0473-44a1-965e-7fa4adb60043 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1964.135955] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1964.136143] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 1964.137077] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-7b8ef4f7-7e43-4a9d-9325-ef24b1ae8515 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1964.142201] env[68571]: DEBUG oslo_vmware.api [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Waiting for the task: (returnval){ [ 1964.142201] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]528a4250-5cdb-2b33-b3a2-c47c9e2b0082" [ 1964.142201] env[68571]: _type = "Task" [ 1964.142201] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1964.149542] env[68571]: DEBUG oslo_vmware.api [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]528a4250-5cdb-2b33-b3a2-c47c9e2b0082, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1964.220858] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 1964.221207] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 1964.221405] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Deleting the datastore file [datastore1] 1f8dd053-ebd8-4ad9-a607-ab364a3320ca {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 1964.221703] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-c3b363f9-3468-4701-b57d-d99ed5893756 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1964.228808] env[68571]: DEBUG oslo_vmware.api [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Waiting for the task: (returnval){ [ 1964.228808] env[68571]: value = "task-3467768" [ 1964.228808] env[68571]: _type = "Task" [ 1964.228808] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1964.236576] env[68571]: DEBUG oslo_vmware.api [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Task: {'id': task-3467768, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1964.653419] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 1964.653785] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Creating directory with path [datastore1] vmware_temp/fc3c914c-b348-42a1-8f96-0f82ef069a61/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 1964.654018] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-1b413bc8-e06b-42ea-a872-6dbdd8bd84a3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1964.665725] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Created directory with path [datastore1] vmware_temp/fc3c914c-b348-42a1-8f96-0f82ef069a61/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 1964.665928] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Fetch image to [datastore1] vmware_temp/fc3c914c-b348-42a1-8f96-0f82ef069a61/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 1964.666113] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/fc3c914c-b348-42a1-8f96-0f82ef069a61/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 1964.666841] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de447a38-52cc-41ac-87fe-9d1b117f7965 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1964.673294] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dd98c563-b1f4-4e93-8c03-0d23c89b1e09 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1964.682717] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-96d918fc-03a9-4759-bcf4-55938d9f0c70 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1964.715173] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-6561ac44-55e8-4c96-8650-5dfe842ae65e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1964.721433] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-3380774c-2bf4-48b3-89c2-60a2feef7b8c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1964.737109] env[68571]: DEBUG oslo_vmware.api [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Task: {'id': task-3467768, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.070357} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1964.737344] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 1964.737534] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 1964.737701] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1964.738454] env[68571]: INFO nova.compute.manager [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Took 0.61 seconds to destroy the instance on the hypervisor. 
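The fetch sequence above ("Fetch image to ... tmp-sparse.vmdk", "Downloading image file data ... to the data store") reduces to an authenticated HTTP PUT against the datastore's /folder endpoint, authorized via the SessionManager.AcquireGenericServiceTicket call that appears alongside it. A stdlib-only sketch of that transfer; the host, cookie name, and ticket handling are assumptions here, since Nova actually delegates this to oslo_vmware.rw_handles:

    import http.client

    def upload_to_datastore(host, ds_rel_path, ticket_id, chunks, size):
        """Stream image bytes to https://<host>/folder/<path>?...&dsName=..."""
        conn = http.client.HTTPSConnection(host, 443)
        conn.putrequest('PUT', '/folder/%s?dcPath=ha-datacenter&dsName=datastore1'
                        % ds_rel_path)
        # Assumption: ticket-based auth cookie derived from the
        # AcquireGenericServiceTicket response.
        conn.putheader('Cookie', 'vmware_cgi_ticket=%s' % ticket_id)
        conn.putheader('Content-Length', str(size))
        conn.endheaders()
        for chunk in chunks:
            conn.send(chunk)          # "Completed reading data from the image iterator"
        resp = conn.getresponse()     # the earlier rw_handles WARNING fired when
        resp.read()                   # the far end closed before this response
        conn.close()
        return resp.status

The RemoteDisconnected traceback logged earlier comes from exactly this final getresponse() step: the ESX host closed the connection without replying after the write completed.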
[ 1964.740636] env[68571]: DEBUG nova.compute.claims [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 1964.740636] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1964.740636] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1964.744840] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 1964.930056] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61925577-70cb-4be3-a796-8d0070d2825f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1964.934116] env[68571]: DEBUG oslo_vmware.rw_handles [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fc3c914c-b348-42a1-8f96-0f82ef069a61/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 1964.991499] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-41847b13-9276-4e7f-865b-c85f234dec13 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1964.998559] env[68571]: DEBUG oslo_vmware.rw_handles [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 1964.998773] env[68571]: DEBUG oslo_vmware.rw_handles [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Closing write handle for https://esx7c1n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fc3c914c-b348-42a1-8f96-0f82ef069a61/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 1965.025036] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0ac07f87-7f55-485c-8eb0-ae0e27121a1e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1965.032289] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5829da46-ba45-4575-bdb8-cf9cc8365d46 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1965.045044] env[68571]: DEBUG nova.compute.provider_tree [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1965.053274] env[68571]: DEBUG nova.scheduler.client.report [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1965.068436] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.328s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1965.068962] env[68571]: ERROR nova.compute.manager [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1965.068962] env[68571]: Faults: ['InvalidArgument'] [ 1965.068962] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Traceback (most recent call last): [ 1965.068962] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 1965.068962] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] self.driver.spawn(context, instance, image_meta, [ 1965.068962] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 1965.068962] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] self._vmops.spawn(context, instance, image_meta, injected_files, [ 1965.068962] env[68571]: ERROR nova.compute.manager [instance: 
1f8dd053-ebd8-4ad9-a607-ab364a3320ca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 1965.068962] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] self._fetch_image_if_missing(context, vi) [ 1965.068962] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 1965.068962] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] image_cache(vi, tmp_image_ds_loc) [ 1965.068962] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 1965.069434] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] vm_util.copy_virtual_disk( [ 1965.069434] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 1965.069434] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] session._wait_for_task(vmdk_copy_task) [ 1965.069434] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 1965.069434] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] return self.wait_for_task(task_ref) [ 1965.069434] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 1965.069434] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] return evt.wait() [ 1965.069434] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 1965.069434] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] result = hub.switch() [ 1965.069434] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 1965.069434] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] return self.greenlet.switch() [ 1965.069434] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 1965.069434] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] self.f(*self.args, **self.kw) [ 1965.070048] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 1965.070048] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] raise exceptions.translate_fault(task_info.error) [ 1965.070048] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 1965.070048] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Faults: 
['InvalidArgument'] [ 1965.070048] env[68571]: ERROR nova.compute.manager [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] [ 1965.070587] env[68571]: DEBUG nova.compute.utils [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 1965.072225] env[68571]: DEBUG nova.compute.manager [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Build of instance 1f8dd053-ebd8-4ad9-a607-ab364a3320ca was re-scheduled: A specified parameter was not correct: fileType [ 1965.072225] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 1965.072568] env[68571]: DEBUG nova.compute.manager [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 1965.072750] env[68571]: DEBUG nova.compute.manager [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 1965.072930] env[68571]: DEBUG nova.compute.manager [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1965.073112] env[68571]: DEBUG nova.network.neutron [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1965.359309] env[68571]: DEBUG nova.network.neutron [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1965.371184] env[68571]: INFO nova.compute.manager [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Took 0.30 seconds to deallocate network for instance. 
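The records above show the compute manager's standard failure path: the spawn error is recorded, the resource claim is aborted, and _do_build_and_run_instance re-schedules the request rather than failing it outright. A compressed, self-contained model of that decision; the class and helper names are stand-ins, not Nova's actual manager code:

    class RescheduledException(Exception):
        """Stand-in for nova.exception.RescheduledException."""

    def do_build_and_run_instance(spawn, cleanup_networks, reschedule):
        """One build attempt; on a reschedulable failure, clean up locally
        and hand the request back to the scheduler."""
        try:
            spawn()
            return 'ACTIVE'
        except RescheduledException:
            # Mirrors the order logged above: unplug VIFs where the driver
            # supports it, deallocate networking, then re-schedule.
            cleanup_networks()
            reschedule()
            return 'RESCHEDULED'

In the real manager the re-schedule goes back through the conductor so another host can be picked, which is why the log continues with network deallocation and placement-allocation cleanup on this node.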
[ 1965.468631] env[68571]: INFO nova.scheduler.client.report [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Deleted allocations for instance 1f8dd053-ebd8-4ad9-a607-ab364a3320ca [ 1965.492064] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e83b5fbd-297d-419c-9cd3-96c4d5a7178d tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Lock "1f8dd053-ebd8-4ad9-a607-ab364a3320ca" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 596.710s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1965.492283] env[68571]: DEBUG oslo_concurrency.lockutils [None req-fd85a834-21e2-40a1-abc0-4151f4a7065c tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Lock "1f8dd053-ebd8-4ad9-a607-ab364a3320ca" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 400.788s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1965.492503] env[68571]: DEBUG oslo_concurrency.lockutils [None req-fd85a834-21e2-40a1-abc0-4151f4a7065c tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Acquiring lock "1f8dd053-ebd8-4ad9-a607-ab364a3320ca-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1965.492710] env[68571]: DEBUG oslo_concurrency.lockutils [None req-fd85a834-21e2-40a1-abc0-4151f4a7065c tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Lock "1f8dd053-ebd8-4ad9-a607-ab364a3320ca-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1965.492874] env[68571]: DEBUG oslo_concurrency.lockutils [None req-fd85a834-21e2-40a1-abc0-4151f4a7065c tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Lock "1f8dd053-ebd8-4ad9-a607-ab364a3320ca-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1965.494768] env[68571]: INFO nova.compute.manager [None req-fd85a834-21e2-40a1-abc0-4151f4a7065c tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Terminating instance [ 1965.496409] env[68571]: DEBUG nova.compute.manager [None req-fd85a834-21e2-40a1-abc0-4151f4a7065c tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Start destroying the instance on the hypervisor.
{{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 1965.496602] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-fd85a834-21e2-40a1-abc0-4151f4a7065c tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 1965.497087] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-10a77f3b-dacd-401d-888c-e18a9d06195c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1965.505763] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-bdfe7974-7351-413e-b26e-27657416a5e7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1965.534169] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-fd85a834-21e2-40a1-abc0-4151f4a7065c tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 1f8dd053-ebd8-4ad9-a607-ab364a3320ca could not be found. [ 1965.534386] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-fd85a834-21e2-40a1-abc0-4151f4a7065c tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 1965.534563] env[68571]: INFO nova.compute.manager [None req-fd85a834-21e2-40a1-abc0-4151f4a7065c tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Took 0.04 seconds to destroy the instance on the hypervisor. [ 1965.534814] env[68571]: DEBUG oslo.service.loopingcall [None req-fd85a834-21e2-40a1-abc0-4151f4a7065c tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1965.535054] env[68571]: DEBUG nova.compute.manager [-] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 1965.535153] env[68571]: DEBUG nova.network.neutron [-] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 1965.560798] env[68571]: DEBUG nova.network.neutron [-] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1965.568758] env[68571]: INFO nova.compute.manager [-] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] Took 0.03 seconds to deallocate network for instance.
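Two things stand out in the terminate path above: the driver tolerates an already-missing backend VM (the InstanceNotFound WARNING is downgraded to "Instance destroyed"), and network deallocation runs inside an oslo.service looping call, which is what the "Waiting for function ... _deallocate_network_with_retries" record reflects. A runnable sketch of that retry shape, assuming a caller-supplied deallocate callable (Nova itself uses a back-off variant of the looping call):

    from oslo_service import loopingcall

    def deallocate_with_retries(deallocate, max_attempts=3):
        """Invoke deallocate() periodically until it succeeds or retries run out."""
        attempts = 0

        def _try_once():
            nonlocal attempts
            attempts += 1
            try:
                deallocate()
            except Exception:
                if attempts >= max_attempts:
                    raise          # give up: wait() below re-raises this
                return             # try again on the next tick
            # Success: stop the looping call and unblock the waiter.
            raise loopingcall.LoopingCallDone()

        timer = loopingcall.FixedIntervalLoopingCall(_try_once)
        timer.start(interval=0.5).wait()

The enclosing greenthread blocks on wait(), matching the "Waiting for function ... to return" DEBUG line emitted by loopingcall.py above.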
[ 1965.663408] env[68571]: DEBUG oslo_concurrency.lockutils [None req-fd85a834-21e2-40a1-abc0-4151f4a7065c tempest-ServerRescueTestJSON-1944435068 tempest-ServerRescueTestJSON-1944435068-project-member] Lock "1f8dd053-ebd8-4ad9-a607-ab364a3320ca" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.171s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1965.664361] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "1f8dd053-ebd8-4ad9-a607-ab364a3320ca" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 244.168s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1965.664559] env[68571]: INFO nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 1f8dd053-ebd8-4ad9-a607-ab364a3320ca] During sync_power_state the instance has a pending task (deleting). Skip. [ 1965.664870] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "1f8dd053-ebd8-4ad9-a607-ab364a3320ca" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1967.490081] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1967.501762] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1967.501975] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1967.502156] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1967.502311] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 1967.503413] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cdb12d89-3421-4107-be57-0e69ccc2b0dd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1967.513036] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with
opID=oslo.vmware-d063cae0-0c4d-469e-a457-ffc83193e7ba {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1967.527748] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8fb2a923-a30f-47bd-820a-57ff1d658667 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1967.534562] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7483bc7-27a6-4cdf-ae32-114a9ba96f03 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1967.563287] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180930MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 1967.563457] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1967.563611] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1967.625224] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1967.625398] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 8506e00f-2b77-4fa1-804a-8e548b78ee7d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1967.625509] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 7fd03349-420c-4076-959c-31562e95098d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1967.625630] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 5deee3f1-70a0-4c0d-bda6-365235ca0d78 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1967.625752] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1967.625888] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 62ce83ad-bb1b-4f78-8d0b-9b516290bac6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1967.625986] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance ad3a9183-0e9e-44df-b920-b8b8360a65e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 1967.626177] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 1967.626312] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 1967.710927] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f1c1c518-604c-4ab3-a93e-b4b997bb33bb {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1967.717998] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d1145983-ebaa-4281-8f7d-3358009d9306 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1967.746935] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-508404b1-3952-4bb3-9d7b-c48172997743 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1967.753882] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebc1359c-2d51-4591-ae2d-86aed53c6364 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1967.766420] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1967.775093] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 
00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1967.788409] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 1967.788586] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.225s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1968.788273] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1968.788757] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1971.489665] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1972.489593] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1973.485236] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1974.655889] env[68571]: DEBUG oslo_concurrency.lockutils [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Acquiring lock "49f25bb6-d27f-468c-ba5d-2f5d96bb04df" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1974.656226] env[68571]: DEBUG oslo_concurrency.lockutils [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Lock "49f25bb6-d27f-468c-ba5d-2f5d96bb04df" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1974.665851] env[68571]: DEBUG nova.compute.manager [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1974.710671] env[68571]: DEBUG oslo_concurrency.lockutils [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1974.710909] env[68571]: DEBUG oslo_concurrency.lockutils [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1974.712332] env[68571]: INFO nova.compute.claims [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1974.841805] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7a8cb63f-68aa-4e52-a07d-b83cdf43415b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1974.849499] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f24c0c6-26a3-4102-b937-7a9a18856a9b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1974.878265] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2531c893-190b-479b-bbd1-29d515583773 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1974.884893] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d3d7bd36-30c8-44f2-b1f6-612851691e16 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1974.897653] env[68571]: DEBUG nova.compute.provider_tree [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1974.905832] env[68571]: DEBUG nova.scheduler.client.report [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 
'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1974.921508] env[68571]: DEBUG oslo_concurrency.lockutils [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.211s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1974.922052] env[68571]: DEBUG nova.compute.manager [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1974.955117] env[68571]: DEBUG nova.compute.utils [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1974.956358] env[68571]: DEBUG nova.compute.manager [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Not allocating networking since 'none' was specified. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1952}} [ 1974.964948] env[68571]: DEBUG nova.compute.manager [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1975.025343] env[68571]: DEBUG nova.compute.manager [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1975.052617] env[68571]: DEBUG nova.virt.hardware [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1975.052861] env[68571]: DEBUG nova.virt.hardware [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1975.053025] env[68571]: DEBUG nova.virt.hardware [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1975.053206] env[68571]: DEBUG nova.virt.hardware [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1975.053349] env[68571]: DEBUG nova.virt.hardware [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1975.053492] env[68571]: DEBUG nova.virt.hardware [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1975.053691] env[68571]: DEBUG nova.virt.hardware [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1975.053847] env[68571]: DEBUG nova.virt.hardware [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1975.054051] env[68571]: DEBUG nova.virt.hardware [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 
tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1975.054218] env[68571]: DEBUG nova.virt.hardware [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1975.054389] env[68571]: DEBUG nova.virt.hardware [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1975.055225] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9ac762e8-ba86-4e26-86e7-8f7eb04e3f15 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1975.063070] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d8618ea5-7d46-4a0d-ae1b-e088e2e933ce {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1975.076242] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Instance VIF info [] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1975.081643] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Creating folder: Project (8abbb9e2c26a41f0b3f222b96352f3db). Parent ref: group-v692787. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1975.081909] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-c1bb827c-f1a8-4db3-9a73-323c5a5cd1a8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1975.091414] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Created folder: Project (8abbb9e2c26a41f0b3f222b96352f3db) in parent group-v692787. [ 1975.091622] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Creating folder: Instances. Parent ref: group-v692894. {{(pid=68571) create_folder /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1589}} [ 1975.091809] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateFolder with opID=oslo.vmware-fbd55a6b-ce37-4646-918d-306a86b90182 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1975.099901] env[68571]: INFO nova.virt.vmwareapi.vm_util [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Created folder: Instances in parent group-v692894. 
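
The nova.virt.hardware lines above walk the CPU-topology search for this m1.nano flavor: 1 vCPU with no flavor or image limits (so sockets, cores and threads are all capped at 65536), which yields exactly one candidate, VirtCPUTopology(cores=1,sockets=1,threads=1). A toy re-derivation of that enumeration step (an illustrative sketch, not Nova's code, which also sorts candidates by preference):

def possible_topologies(vcpus, max_sockets=65536, max_cores=65536,
                        max_threads=65536):
    # Enumerate (sockets, cores, threads) triples whose product equals
    # the vCPU count; the caps come from flavor/image limits and default
    # high when no limits are set, as in the log above.
    for s in range(1, min(max_sockets, vcpus) + 1):
        for c in range(1, min(max_cores, vcpus) + 1):
            for t in range(1, min(max_threads, vcpus) + 1):
                if s * c * t == vcpus:
                    yield (s, c, t)

print(list(possible_topologies(1)))   # [(1, 1, 1)], matching the log
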
[ 1975.100130] env[68571]: DEBUG oslo.service.loopingcall [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1975.100306] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1975.100486] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-760d8c3d-15d8-4769-ab67-accf0ac1533b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1975.115919] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1975.115919] env[68571]: value = "task-3467771" [ 1975.115919] env[68571]: _type = "Task" [ 1975.115919] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1975.122747] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467771, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1975.490030] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1975.490030] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 1975.490030] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 1975.508823] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.509027] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.509173] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 7fd03349-420c-4076-959c-31562e95098d] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.509306] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Skipping network cache update for instance because it is Building. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.509438] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.509592] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.509721] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.509844] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 1975.509965] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 1975.625658] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467771, 'name': CreateVM_Task, 'duration_secs': 0.234827} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 1975.625830] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 1975.626261] env[68571]: DEBUG oslo_concurrency.lockutils [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1975.626422] env[68571]: DEBUG oslo_concurrency.lockutils [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1975.626746] env[68571]: DEBUG oslo_concurrency.lockutils [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 1975.627028] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-172a2248-119b-4623-a236-fb001e6b7fd0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 
1975.631286] env[68571]: DEBUG oslo_vmware.api [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Waiting for the task: (returnval){ [ 1975.631286] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52f81632-6e8e-6ace-02a0-314fca3eb844" [ 1975.631286] env[68571]: _type = "Task" [ 1975.631286] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1975.638747] env[68571]: DEBUG oslo_vmware.api [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52f81632-6e8e-6ace-02a0-314fca3eb844, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 1976.142565] env[68571]: DEBUG oslo_concurrency.lockutils [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1976.142920] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 1976.143069] env[68571]: DEBUG oslo_concurrency.lockutils [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1977.489363] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1980.489528] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1980.489950] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... 
{{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 1990.485869] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 1998.297650] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquiring lock "17530424-18ad-4713-ae56-acbe585bd5d9" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1998.297959] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Lock "17530424-18ad-4713-ae56-acbe585bd5d9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1998.308778] env[68571]: DEBUG nova.compute.manager [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Starting instance... {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 1998.353767] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1998.354010] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1998.355357] env[68571]: INFO nova.compute.claims [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 1998.502474] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9370e75f-57b9-4bc4-8a39-0cc9ccc0d4e6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1998.510119] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c1bbafa2-b93b-4d45-8abb-c7c88e8db471 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1998.539300] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-ea498a39-12b9-49c9-b1db-23be643eb63c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1998.545912] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-de158398-96ef-442b-ae14-9153a7240f46 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1998.558515] env[68571]: DEBUG nova.compute.provider_tree [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 1998.567907] env[68571]: DEBUG nova.scheduler.client.report [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 1998.580098] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.226s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1998.580543] env[68571]: DEBUG nova.compute.manager [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 1998.611156] env[68571]: DEBUG nova.compute.utils [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 1998.612420] env[68571]: DEBUG nova.compute.manager [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Allocating IP information in the background. 
{{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 1998.612588] env[68571]: DEBUG nova.network.neutron [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 1998.621260] env[68571]: DEBUG nova.compute.manager [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 1998.681215] env[68571]: DEBUG nova.compute.manager [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Start spawning the instance on the hypervisor. {{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 1998.689111] env[68571]: DEBUG nova.policy [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'b21fda9650f1447a81a5994f05fc8078', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '157830f5757b429383d95b2b4c0a384c', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 1998.704083] env[68571]: DEBUG nova.virt.hardware [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 1998.704320] env[68571]: DEBUG nova.virt.hardware [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 1998.704473] env[68571]: DEBUG nova.virt.hardware [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Image limits 0:0:0 
{{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 1998.704647] env[68571]: DEBUG nova.virt.hardware [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 1998.704790] env[68571]: DEBUG nova.virt.hardware [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 1998.704935] env[68571]: DEBUG nova.virt.hardware [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 1998.705151] env[68571]: DEBUG nova.virt.hardware [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 1998.705310] env[68571]: DEBUG nova.virt.hardware [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 1998.705471] env[68571]: DEBUG nova.virt.hardware [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 1998.705632] env[68571]: DEBUG nova.virt.hardware [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 1998.705797] env[68571]: DEBUG nova.virt.hardware [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 1998.706693] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c036ce55-6f85-4405-8c91-8010478ab4af {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1998.714696] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ba6d99dc-2410-43fd-9d16-a4c1f192cfdf {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1998.984759] env[68571]: DEBUG nova.network.neutron [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 
tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Successfully created port: ba46b3fb-5594-4dae-9cf8-4931d819c381 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 1999.549358] env[68571]: DEBUG nova.compute.manager [req-a5113020-fca5-4a43-806f-6e4540b9e224 req-bf611fdf-c63a-4f58-9295-e1e5469767e2 service nova] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Received event network-vif-plugged-ba46b3fb-5594-4dae-9cf8-4931d819c381 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 1999.549618] env[68571]: DEBUG oslo_concurrency.lockutils [req-a5113020-fca5-4a43-806f-6e4540b9e224 req-bf611fdf-c63a-4f58-9295-e1e5469767e2 service nova] Acquiring lock "17530424-18ad-4713-ae56-acbe585bd5d9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 1999.549793] env[68571]: DEBUG oslo_concurrency.lockutils [req-a5113020-fca5-4a43-806f-6e4540b9e224 req-bf611fdf-c63a-4f58-9295-e1e5469767e2 service nova] Lock "17530424-18ad-4713-ae56-acbe585bd5d9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 1999.550083] env[68571]: DEBUG oslo_concurrency.lockutils [req-a5113020-fca5-4a43-806f-6e4540b9e224 req-bf611fdf-c63a-4f58-9295-e1e5469767e2 service nova] Lock "17530424-18ad-4713-ae56-acbe585bd5d9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 1999.550285] env[68571]: DEBUG nova.compute.manager [req-a5113020-fca5-4a43-806f-6e4540b9e224 req-bf611fdf-c63a-4f58-9295-e1e5469767e2 service nova] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] No waiting events found dispatching network-vif-plugged-ba46b3fb-5594-4dae-9cf8-4931d819c381 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 1999.550453] env[68571]: WARNING nova.compute.manager [req-a5113020-fca5-4a43-806f-6e4540b9e224 req-bf611fdf-c63a-4f58-9295-e1e5469767e2 service nova] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Received unexpected event network-vif-plugged-ba46b3fb-5594-4dae-9cf8-4931d819c381 for instance with vm_state building and task_state spawning. 
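
The WARNING just above is the external-event race in action: Neutron reports network-vif-plugged for port ba46b3fb before the compute manager has registered a waiter for that event, so pop_instance_event finds "No waiting events" and can only log the unexpected arrival. A hypothetical sketch of that register-then-dispatch handshake (illustrative names only, not Nova's API):

import threading

class InstanceEvents:
    def __init__(self):
        self._lock = threading.Lock()
        self._waiters = {}  # event name -> threading.Event

    def prepare(self, name):
        # Register interest *before* triggering the action that emits
        # the event, so a fast backend cannot win the race.
        ev = threading.Event()
        with self._lock:
            self._waiters[name] = ev
        return ev

    def pop_event(self, name):
        # Dispatch an incoming event; warn if nobody is waiting for it.
        with self._lock:
            ev = self._waiters.pop(name, None)
        if ev is None:
            print(f"WARNING: unexpected event {name}, no waiter registered")
        else:
            ev.set()

events = InstanceEvents()
# Event arrives before any waiter was registered -> warning, as in the log.
events.pop_event("network-vif-plugged-ba46b3fb")

Here the spawn path simply was not waiting on the plug event, so the warning is informational and the build proceeds to the port update and network-info cache refresh below.
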
[ 1999.618196] env[68571]: DEBUG nova.network.neutron [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Successfully updated port: ba46b3fb-5594-4dae-9cf8-4931d819c381 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 1999.629111] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquiring lock "refresh_cache-17530424-18ad-4713-ae56-acbe585bd5d9" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 1999.629300] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquired lock "refresh_cache-17530424-18ad-4713-ae56-acbe585bd5d9" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 1999.629457] env[68571]: DEBUG nova.network.neutron [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 1999.668722] env[68571]: DEBUG nova.network.neutron [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Instance cache missing network info. 
{{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 1999.817720] env[68571]: DEBUG nova.network.neutron [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Updating instance_info_cache with network_info: [{"id": "ba46b3fb-5594-4dae-9cf8-4931d819c381", "address": "fa:16:3e:0e:1e:63", "network": {"id": "375189f9-b770-49f9-a6e3-f686fe031694", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-452330210-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "157830f5757b429383d95b2b4c0a384c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c979f78-8597-41f8-b1de-995014032689", "external-id": "nsx-vlan-transportzone-477", "segmentation_id": 477, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapba46b3fb-55", "ovs_interfaceid": "ba46b3fb-5594-4dae-9cf8-4931d819c381", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 1999.828420] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Releasing lock "refresh_cache-17530424-18ad-4713-ae56-acbe585bd5d9" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 1999.828720] env[68571]: DEBUG nova.compute.manager [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Instance network_info: |[{"id": "ba46b3fb-5594-4dae-9cf8-4931d819c381", "address": "fa:16:3e:0e:1e:63", "network": {"id": "375189f9-b770-49f9-a6e3-f686fe031694", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-452330210-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "157830f5757b429383d95b2b4c0a384c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c979f78-8597-41f8-b1de-995014032689", "external-id": "nsx-vlan-transportzone-477", "segmentation_id": 477, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapba46b3fb-55", "ovs_interfaceid": "ba46b3fb-5594-4dae-9cf8-4931d819c381", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 1999.829130] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:0e:1e:63', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '8c979f78-8597-41f8-b1de-995014032689', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'ba46b3fb-5594-4dae-9cf8-4931d819c381', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 1999.836894] env[68571]: DEBUG oslo.service.loopingcall [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 1999.837369] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 1999.837604] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-1a8f6abc-b372-428f-895f-8b8c3ff86b3f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 1999.857993] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 1999.857993] env[68571]: value = "task-3467772" [ 1999.857993] env[68571]: _type = "Task" [ 1999.857993] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 1999.865844] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467772, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2000.368811] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467772, 'name': CreateVM_Task, 'duration_secs': 0.285967} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2000.368974] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2000.375954] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2000.376142] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2000.376450] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2000.376694] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-4c45c062-def4-46dc-ab84-1c15784610ec {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2000.380903] env[68571]: DEBUG oslo_vmware.api [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Waiting for the task: (returnval){ [ 2000.380903] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52d5df05-0b8c-9a15-624b-3ac0bf1b1731" [ 2000.380903] env[68571]: _type = "Task" [ 2000.380903] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2000.388218] env[68571]: DEBUG oslo_vmware.api [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52d5df05-0b8c-9a15-624b-3ac0bf1b1731, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2000.890836] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2000.891228] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2000.891286] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2001.573283] env[68571]: DEBUG nova.compute.manager [req-78460257-3372-46df-816a-d24173ac4f0c req-b113c93d-a782-4fa6-a0c6-07e38eb80bd3 service nova] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Received event network-changed-ba46b3fb-5594-4dae-9cf8-4931d819c381 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2001.573478] env[68571]: DEBUG nova.compute.manager [req-78460257-3372-46df-816a-d24173ac4f0c req-b113c93d-a782-4fa6-a0c6-07e38eb80bd3 service nova] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Refreshing instance network info cache due to event network-changed-ba46b3fb-5594-4dae-9cf8-4931d819c381. {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2001.573686] env[68571]: DEBUG oslo_concurrency.lockutils [req-78460257-3372-46df-816a-d24173ac4f0c req-b113c93d-a782-4fa6-a0c6-07e38eb80bd3 service nova] Acquiring lock "refresh_cache-17530424-18ad-4713-ae56-acbe585bd5d9" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2001.573829] env[68571]: DEBUG oslo_concurrency.lockutils [req-78460257-3372-46df-816a-d24173ac4f0c req-b113c93d-a782-4fa6-a0c6-07e38eb80bd3 service nova] Acquired lock "refresh_cache-17530424-18ad-4713-ae56-acbe585bd5d9" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2001.573988] env[68571]: DEBUG nova.network.neutron [req-78460257-3372-46df-816a-d24173ac4f0c req-b113c93d-a782-4fa6-a0c6-07e38eb80bd3 service nova] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Refreshing network info cache for port ba46b3fb-5594-4dae-9cf8-4931d819c381 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2001.796501] env[68571]: DEBUG nova.network.neutron [req-78460257-3372-46df-816a-d24173ac4f0c req-b113c93d-a782-4fa6-a0c6-07e38eb80bd3 service nova] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Updated VIF entry in instance network info cache for port ba46b3fb-5594-4dae-9cf8-4931d819c381. 
{{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2001.796835] env[68571]: DEBUG nova.network.neutron [req-78460257-3372-46df-816a-d24173ac4f0c req-b113c93d-a782-4fa6-a0c6-07e38eb80bd3 service nova] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Updating instance_info_cache with network_info: [{"id": "ba46b3fb-5594-4dae-9cf8-4931d819c381", "address": "fa:16:3e:0e:1e:63", "network": {"id": "375189f9-b770-49f9-a6e3-f686fe031694", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-452330210-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "157830f5757b429383d95b2b4c0a384c", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "8c979f78-8597-41f8-b1de-995014032689", "external-id": "nsx-vlan-transportzone-477", "segmentation_id": 477, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapba46b3fb-55", "ovs_interfaceid": "ba46b3fb-5594-4dae-9cf8-4931d819c381", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2001.806704] env[68571]: DEBUG oslo_concurrency.lockutils [req-78460257-3372-46df-816a-d24173ac4f0c req-b113c93d-a782-4fa6-a0c6-07e38eb80bd3 service nova] Releasing lock "refresh_cache-17530424-18ad-4713-ae56-acbe585bd5d9" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2011.858484] env[68571]: WARNING oslo_vmware.rw_handles [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2011.858484] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2011.858484] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2011.858484] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2011.858484] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2011.858484] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 2011.858484] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2011.858484] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2011.858484] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2011.858484] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2011.858484] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2011.858484] env[68571]: ERROR oslo_vmware.rw_handles [ 2011.859033] env[68571]: DEBUG nova.virt.vmwareapi.images [None 
req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/fc3c914c-b348-42a1-8f96-0f82ef069a61/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2011.861168] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2011.861446] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Copying Virtual Disk [datastore1] vmware_temp/fc3c914c-b348-42a1-8f96-0f82ef069a61/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/fc3c914c-b348-42a1-8f96-0f82ef069a61/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2011.861738] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-826fe824-bc2b-4e51-ae88-0f86d28704e2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2011.870052] env[68571]: DEBUG oslo_vmware.api [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Waiting for the task: (returnval){ [ 2011.870052] env[68571]: value = "task-3467773" [ 2011.870052] env[68571]: _type = "Task" [ 2011.870052] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2011.876857] env[68571]: DEBUG oslo_vmware.api [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Task: {'id': task-3467773, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2012.380271] env[68571]: DEBUG oslo_vmware.exceptions [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Fault InvalidArgument not matched. 
{{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2012.380551] env[68571]: DEBUG oslo_concurrency.lockutils [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2012.381111] env[68571]: ERROR nova.compute.manager [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2012.381111] env[68571]: Faults: ['InvalidArgument'] [ 2012.381111] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Traceback (most recent call last): [ 2012.381111] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2012.381111] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] yield resources [ 2012.381111] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2012.381111] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] self.driver.spawn(context, instance, image_meta, [ 2012.381111] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2012.381111] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2012.381111] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2012.381111] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] self._fetch_image_if_missing(context, vi) [ 2012.381111] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2012.381410] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] image_cache(vi, tmp_image_ds_loc) [ 2012.381410] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2012.381410] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] vm_util.copy_virtual_disk( [ 2012.381410] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2012.381410] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] session._wait_for_task(vmdk_copy_task) [ 2012.381410] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2012.381410] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] return self.wait_for_task(task_ref) [ 2012.381410] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2012.381410] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] return evt.wait() [ 2012.381410] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2012.381410] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] result = hub.switch() [ 2012.381410] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2012.381410] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] return self.greenlet.switch() [ 2012.381713] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2012.381713] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] self.f(*self.args, **self.kw) [ 2012.381713] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2012.381713] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] raise exceptions.translate_fault(task_info.error) [ 2012.381713] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2012.381713] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Faults: ['InvalidArgument'] [ 2012.381713] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] [ 2012.381713] env[68571]: INFO nova.compute.manager [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Terminating instance [ 2012.382950] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2012.383191] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2012.383435] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-11114e52-fea6-4b09-b480-fc656f89267f {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2012.385747] env[68571]: DEBUG nova.compute.manager [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2012.385941] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2012.386660] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b323e2ba-d59c-4e14-984c-8c624f8a8e8c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2012.393281] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2012.393502] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-40caedf5-973a-4d24-8f57-d5b566093fdf {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2012.395595] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2012.395762] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2012.396678] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-907cf278-b1fb-4511-940f-5f5ed4f5b30b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2012.401469] env[68571]: DEBUG oslo_vmware.api [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Waiting for the task: (returnval){ [ 2012.401469] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52974976-72c2-23d0-bfa3-733dc7475997" [ 2012.401469] env[68571]: _type = "Task" [ 2012.401469] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2012.415303] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2012.415492] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Creating directory with path [datastore1] vmware_temp/fdd8ee74-6c0f-4be8-bdc1-8a36348e4f63/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2012.415707] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-0d897a83-9818-4043-84b1-87bb72d5b062 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2012.436585] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Created directory with path [datastore1] vmware_temp/fdd8ee74-6c0f-4be8-bdc1-8a36348e4f63/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2012.436777] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Fetch image to [datastore1] vmware_temp/fdd8ee74-6c0f-4be8-bdc1-8a36348e4f63/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2012.436945] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/fdd8ee74-6c0f-4be8-bdc1-8a36348e4f63/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2012.437759] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ec6990f-25e6-4656-9ad8-a315d09ceac4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2012.444672] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b1d19cac-40d2-4c45-9c62-ff26ecae5e2a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2012.454748] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c2744f5-29e7-4e1a-b0b2-701951f819ae {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2012.489617] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a67c43b-6fc9-480c-9138-77151ecf611e {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2012.492311] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2012.492497] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2012.492671] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Deleting the datastore file [datastore1] 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2 {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2012.492916] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-96dec4db-8606-4e05-88af-b2b42c67060c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2012.498718] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-99189c38-2a4d-4ca0-b610-f1b92bf9c09a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2012.500458] env[68571]: DEBUG oslo_vmware.api [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Waiting for the task: (returnval){ [ 2012.500458] env[68571]: value = "task-3467775" [ 2012.500458] env[68571]: _type = "Task" [ 2012.500458] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2012.508308] env[68571]: DEBUG oslo_vmware.api [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Task: {'id': task-3467775, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2012.529677] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2012.583807] env[68571]: DEBUG oslo_vmware.rw_handles [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fdd8ee74-6c0f-4be8-bdc1-8a36348e4f63/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2012.643027] env[68571]: DEBUG oslo_vmware.rw_handles [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2012.643260] env[68571]: DEBUG oslo_vmware.rw_handles [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/fdd8ee74-6c0f-4be8-bdc1-8a36348e4f63/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2013.010976] env[68571]: DEBUG oslo_vmware.api [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Task: {'id': task-3467775, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.072154} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2013.011309] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2013.011445] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2013.011624] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2013.011797] env[68571]: INFO nova.compute.manager [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Took 0.63 seconds to destroy the instance on the hypervisor. 
[ 2013.013862] env[68571]: DEBUG nova.compute.claims [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2013.014043] env[68571]: DEBUG oslo_concurrency.lockutils [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2013.014276] env[68571]: DEBUG oslo_concurrency.lockutils [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2013.168104] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-3fb920db-7977-4066-b1f2-a916f1652d0a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2013.175373] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a667c72-69c7-4259-a3b0-d11fcab67feb {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2013.204505] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-51c81606-0282-4da2-8341-b9ff94f90ef4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2013.211196] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0322ca84-72ea-432b-8b5e-e8dff5894941 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2013.225482] env[68571]: DEBUG nova.compute.provider_tree [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2013.234007] env[68571]: DEBUG nova.scheduler.client.report [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2013.247579] env[68571]: DEBUG oslo_concurrency.lockutils [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 
tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.233s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2013.248111] env[68571]: ERROR nova.compute.manager [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2013.248111] env[68571]: Faults: ['InvalidArgument'] [ 2013.248111] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Traceback (most recent call last): [ 2013.248111] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2013.248111] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] self.driver.spawn(context, instance, image_meta, [ 2013.248111] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2013.248111] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2013.248111] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2013.248111] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] self._fetch_image_if_missing(context, vi) [ 2013.248111] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2013.248111] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] image_cache(vi, tmp_image_ds_loc) [ 2013.248111] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2013.248435] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] vm_util.copy_virtual_disk( [ 2013.248435] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2013.248435] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] session._wait_for_task(vmdk_copy_task) [ 2013.248435] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2013.248435] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] return self.wait_for_task(task_ref) [ 2013.248435] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2013.248435] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] return evt.wait() [ 2013.248435] 
env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2013.248435] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] result = hub.switch() [ 2013.248435] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2013.248435] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] return self.greenlet.switch() [ 2013.248435] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2013.248435] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] self.f(*self.args, **self.kw) [ 2013.248878] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2013.248878] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] raise exceptions.translate_fault(task_info.error) [ 2013.248878] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2013.248878] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Faults: ['InvalidArgument'] [ 2013.248878] env[68571]: ERROR nova.compute.manager [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] [ 2013.248878] env[68571]: DEBUG nova.compute.utils [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2013.250198] env[68571]: DEBUG nova.compute.manager [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Build of instance 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2 was re-scheduled: A specified parameter was not correct: fileType [ 2013.250198] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2013.250573] env[68571]: DEBUG nova.compute.manager [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2013.250785] env[68571]: DEBUG nova.compute.manager [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged.
{{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2013.251098] env[68571]: DEBUG nova.compute.manager [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2013.251098] env[68571]: DEBUG nova.network.neutron [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2013.570146] env[68571]: DEBUG nova.network.neutron [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2013.581157] env[68571]: INFO nova.compute.manager [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Took 0.33 seconds to deallocate network for instance. [ 2013.680528] env[68571]: INFO nova.scheduler.client.report [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Deleted allocations for instance 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2 [ 2013.701830] env[68571]: DEBUG oslo_concurrency.lockutils [None req-75f9f3c0-4608-4691-b381-fe4e0ad89fd1 tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 508.674s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2013.702167] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3cf20cfc-35c6-4896-bfb7-a5471fa1665a tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 313.206s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2013.702397] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3cf20cfc-35c6-4896-bfb7-a5471fa1665a tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Acquiring lock "4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2013.702619] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3cf20cfc-35c6-4896-bfb7-a5471fa1665a tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2013.702779] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3cf20cfc-35c6-4896-bfb7-a5471fa1665a tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2013.704718] env[68571]: INFO nova.compute.manager [None req-3cf20cfc-35c6-4896-bfb7-a5471fa1665a tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Terminating instance [ 2013.706495] env[68571]: DEBUG nova.compute.manager [None req-3cf20cfc-35c6-4896-bfb7-a5471fa1665a tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2013.706897] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-3cf20cfc-35c6-4896-bfb7-a5471fa1665a tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2013.707181] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c42c98e7-0e7d-4f83-8cc1-5475bf756802 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2013.715856] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5882511-718c-4a33-aa5f-15b80098b7af {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2013.743409] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-3cf20cfc-35c6-4896-bfb7-a5471fa1665a tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2 could not be found. [ 2013.743615] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-3cf20cfc-35c6-4896-bfb7-a5471fa1665a tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2013.743848] env[68571]: INFO nova.compute.manager [None req-3cf20cfc-35c6-4896-bfb7-a5471fa1665a tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2013.744065] env[68571]: DEBUG oslo.service.loopingcall [None req-3cf20cfc-35c6-4896-bfb7-a5471fa1665a tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2013.744329] env[68571]: DEBUG nova.compute.manager [-] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2013.744423] env[68571]: DEBUG nova.network.neutron [-] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2013.766443] env[68571]: DEBUG nova.network.neutron [-] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2013.774574] env[68571]: INFO nova.compute.manager [-] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] Took 0.03 seconds to deallocate network for instance. [ 2013.858175] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3cf20cfc-35c6-4896-bfb7-a5471fa1665a tempest-AttachVolumeNegativeTest-833435696 tempest-AttachVolumeNegativeTest-833435696-project-member] Lock "4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.156s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2013.858984] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 292.362s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2013.859188] env[68571]: INFO nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2] During sync_power_state the instance has a pending task (deleting). Skip. 
[ 2013.859361] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "4063caa5-90b5-4a18-b3b3-4bc91f2e9bb2" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2029.491048] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2029.491048] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2029.503440] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2029.503647] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2029.503812] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2029.503968] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2029.505105] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-54390b43-5d9c-42eb-b810-f4711503757a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2029.514125] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f993bfe-7ce2-4651-82eb-64f6446aa276 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2029.529767] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a2fcaea7-b0f9-44b4-9c28-2c191135d7f5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2029.535935] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8c236b2f-d418-4d01-91c7-b7a5cb56fad3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2029.564206] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 
None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180922MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2029.564339] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2029.564525] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2029.691617] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 8506e00f-2b77-4fa1-804a-8e548b78ee7d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2029.691782] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 7fd03349-420c-4076-959c-31562e95098d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2029.691911] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 5deee3f1-70a0-4c0d-bda6-365235ca0d78 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2029.692045] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2029.692217] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 62ce83ad-bb1b-4f78-8d0b-9b516290bac6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2029.692340] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance ad3a9183-0e9e-44df-b920-b8b8360a65e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2029.692458] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 49f25bb6-d27f-468c-ba5d-2f5d96bb04df actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2029.692572] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 17530424-18ad-4713-ae56-acbe585bd5d9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2029.692759] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 8 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2029.692894] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1536MB phys_disk=200GB used_disk=8GB total_vcpus=48 used_vcpus=8 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2029.707688] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Refreshing inventories for resource provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 2029.720056] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Updating ProviderTree inventory for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 2029.720249] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Updating inventory in ProviderTree for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2029.730321] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Refreshing aggregate associations for resource provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd, aggregates: None {{(pid=68571) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 
2029.747072] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Refreshing trait associations for resource provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE {{(pid=68571) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:825}} [ 2029.838362] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-78747307-910a-48dd-93b4-52c2f36b1063 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2029.846127] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-aad5abaa-b862-438d-a094-e8ab8f438316 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2029.877293] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9661e037-77f7-4c04-91e5-a9d587e8cef3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2029.884252] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-699fa402-0fc2-42f8-a48f-6bf4c21c088f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2029.896805] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2029.905481] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2029.918613] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2029.918795] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.354s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2030.918312] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2032.489716] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] 
Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2033.489613] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2034.484622] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2036.489444] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2036.489747] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2036.489747] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2036.507465] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.507619] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 7fd03349-420c-4076-959c-31562e95098d] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.507752] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.507879] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.508012] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.508139] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Skipping network cache update for instance because it is Building. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.508262] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.508380] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2036.508518] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2037.489594] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2042.490622] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2042.490930] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2042.491062] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2042.491184] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Cleaning up deleted instances {{(pid=68571) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 2042.503076] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] There are 0 instances to clean {{(pid=68571) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 2043.489111] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2043.489269] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Cleaning up deleted instances with incomplete migration {{(pid=68571) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 2054.490064] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2062.492707] env[68571]: WARNING oslo_vmware.rw_handles [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 
tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2062.492707] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2062.492707] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2062.492707] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2062.492707] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2062.492707] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 2062.492707] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2062.492707] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2062.492707] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2062.492707] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2062.492707] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2062.492707] env[68571]: ERROR oslo_vmware.rw_handles [ 2062.493374] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/fdd8ee74-6c0f-4be8-bdc1-8a36348e4f63/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2062.495293] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2062.495529] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Copying Virtual Disk [datastore1] vmware_temp/fdd8ee74-6c0f-4be8-bdc1-8a36348e4f63/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/fdd8ee74-6c0f-4be8-bdc1-8a36348e4f63/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2062.495820] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-7368f5fd-6933-43a5-b12e-57b76b9f93ff {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2062.503654] env[68571]: DEBUG oslo_vmware.api [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Waiting for the task: (returnval){ [ 2062.503654] env[68571]: value = "task-3467776" [ 2062.503654] env[68571]: _type = "Task" [ 2062.503654] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2062.511224] env[68571]: DEBUG oslo_vmware.api [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Task: {'id': task-3467776, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2063.014144] env[68571]: DEBUG oslo_vmware.exceptions [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Fault InvalidArgument not matched. {{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2063.015017] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2063.015017] env[68571]: ERROR nova.compute.manager [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2063.015017] env[68571]: Faults: ['InvalidArgument'] [ 2063.015017] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Traceback (most recent call last): [ 2063.015017] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2063.015017] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] yield resources [ 2063.015017] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2063.015017] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] self.driver.spawn(context, instance, image_meta, [ 2063.015017] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2063.015017] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2063.015354] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2063.015354] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] self._fetch_image_if_missing(context, vi) [ 2063.015354] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2063.015354] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] image_cache(vi, tmp_image_ds_loc) [ 2063.015354] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] File 
"/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2063.015354] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] vm_util.copy_virtual_disk( [ 2063.015354] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2063.015354] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] session._wait_for_task(vmdk_copy_task) [ 2063.015354] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2063.015354] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] return self.wait_for_task(task_ref) [ 2063.015354] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2063.015354] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] return evt.wait() [ 2063.015354] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2063.015833] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] result = hub.switch() [ 2063.015833] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2063.015833] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] return self.greenlet.switch() [ 2063.015833] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2063.015833] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] self.f(*self.args, **self.kw) [ 2063.015833] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2063.015833] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] raise exceptions.translate_fault(task_info.error) [ 2063.015833] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2063.015833] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Faults: ['InvalidArgument'] [ 2063.015833] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] [ 2063.015833] env[68571]: INFO nova.compute.manager [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Terminating instance [ 2063.016857] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2063.017076] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2063.017322] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-451d4d38-9f67-40a5-a533-6988b2435dd9 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2063.019440] env[68571]: DEBUG nova.compute.manager [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2063.019653] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2063.020381] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1592ea59-36c7-480b-8c07-aafdaf41e146 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2063.027221] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2063.027437] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-fcb35196-3ce6-4dff-84bf-8b55274a73d0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2063.029519] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2063.029692] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2063.030615] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-0ab8c876-85ab-4d74-8339-cfd1932dcd2b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2063.035086] env[68571]: DEBUG oslo_vmware.api [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Waiting for the task: (returnval){ [ 2063.035086] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52441425-1d5d-cd8a-bc58-d61bfa3a0e8a" [ 2063.035086] env[68571]: _type = "Task" [ 2063.035086] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2063.049046] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2063.049266] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Creating directory with path [datastore1] vmware_temp/9006eab2-135e-4768-a1f2-54892bdb5ab8/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2063.049469] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-5722d81b-84d0-4484-85c6-33e9b11659f5 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2063.068462] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Created directory with path [datastore1] vmware_temp/9006eab2-135e-4768-a1f2-54892bdb5ab8/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2063.068651] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Fetch image to [datastore1] vmware_temp/9006eab2-135e-4768-a1f2-54892bdb5ab8/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2063.068827] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/9006eab2-135e-4768-a1f2-54892bdb5ab8/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2063.069593] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-99605355-0f78-47fc-b491-ec7f91770ad8 {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2063.076291] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-eada6996-4d03-4ffe-9403-a40b690e2cbb {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2063.084997] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-309be9b1-7e53-4b95-b7a2-22e2d777bd2d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2063.116050] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5acfeba4-b3ec-4a6f-af6d-34afd05566b4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2063.118421] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2063.118613] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2063.118784] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Deleting the datastore file [datastore1] 8506e00f-2b77-4fa1-804a-8e548b78ee7d {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2063.119007] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b53ff9f0-9119-4c9f-8629-d14acef5da57 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2063.123535] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-c9dc0cd0-c303-4756-b2bc-004e644713b1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2063.126139] env[68571]: DEBUG oslo_vmware.api [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Waiting for the task: (returnval){ [ 2063.126139] env[68571]: value = "task-3467778" [ 2063.126139] env[68571]: _type = "Task" [ 2063.126139] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2063.133830] env[68571]: DEBUG oslo_vmware.api [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Task: {'id': task-3467778, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2063.146464] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2063.281587] env[68571]: DEBUG oslo_vmware.rw_handles [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9006eab2-135e-4768-a1f2-54892bdb5ab8/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2063.340669] env[68571]: DEBUG oslo_vmware.rw_handles [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2063.340872] env[68571]: DEBUG oslo_vmware.rw_handles [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9006eab2-135e-4768-a1f2-54892bdb5ab8/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2063.636121] env[68571]: DEBUG oslo_vmware.api [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Task: {'id': task-3467778, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.069272} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2063.636418] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2063.636551] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2063.636720] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2063.636890] env[68571]: INFO nova.compute.manager [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Took 0.62 seconds to destroy the instance on the hypervisor. [ 2063.638969] env[68571]: DEBUG nova.compute.claims [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2063.639155] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2063.639366] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2063.786328] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e8d4c3f-dda0-418d-8963-1b2365a78cef {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2063.793151] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d03dc135-7d24-4058-9a4b-f1a531e32300 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2063.823629] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ef376ff5-091e-48c0-a3ef-846c965e9b5d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2063.830472] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-530b5fbe-bb63-4a2f-bf0b-484ac013192d {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2063.843288] env[68571]: DEBUG nova.compute.provider_tree [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2063.851657] env[68571]: DEBUG nova.scheduler.client.report [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2063.866309] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.227s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2063.866873] env[68571]: ERROR nova.compute.manager [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2063.866873] env[68571]: Faults: ['InvalidArgument'] [ 2063.866873] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Traceback (most recent call last): [ 2063.866873] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2063.866873] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] self.driver.spawn(context, instance, image_meta, [ 2063.866873] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2063.866873] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2063.866873] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2063.866873] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] self._fetch_image_if_missing(context, vi) [ 2063.866873] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2063.866873] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] image_cache(vi, tmp_image_ds_loc) [ 2063.866873] env[68571]: ERROR nova.compute.manager [instance: 
8506e00f-2b77-4fa1-804a-8e548b78ee7d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2063.867188] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] vm_util.copy_virtual_disk( [ 2063.867188] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2063.867188] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] session._wait_for_task(vmdk_copy_task) [ 2063.867188] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2063.867188] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] return self.wait_for_task(task_ref) [ 2063.867188] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2063.867188] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] return evt.wait() [ 2063.867188] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2063.867188] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] result = hub.switch() [ 2063.867188] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2063.867188] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] return self.greenlet.switch() [ 2063.867188] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2063.867188] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] self.f(*self.args, **self.kw) [ 2063.867519] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2063.867519] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] raise exceptions.translate_fault(task_info.error) [ 2063.867519] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2063.867519] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Faults: ['InvalidArgument'] [ 2063.867519] env[68571]: ERROR nova.compute.manager [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] [ 2063.867692] env[68571]: DEBUG nova.compute.utils [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2063.869035] env[68571]: DEBUG nova.compute.manager [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] 
[instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Build of instance 8506e00f-2b77-4fa1-804a-8e548b78ee7d was re-scheduled: A specified parameter was not correct: fileType [ 2063.869035] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2063.869406] env[68571]: DEBUG nova.compute.manager [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2063.869577] env[68571]: DEBUG nova.compute.manager [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2063.869748] env[68571]: DEBUG nova.compute.manager [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2063.869908] env[68571]: DEBUG nova.network.neutron [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2064.185029] env[68571]: DEBUG nova.network.neutron [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2064.196683] env[68571]: INFO nova.compute.manager [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Took 0.33 seconds to deallocate network for instance. 
[ 2064.286383] env[68571]: INFO nova.scheduler.client.report [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Deleted allocations for instance 8506e00f-2b77-4fa1-804a-8e548b78ee7d [ 2064.307094] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9531639b-ac4a-468b-ac48-7fd5d3b9b479 tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Lock "8506e00f-2b77-4fa1-804a-8e548b78ee7d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 514.790s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2064.307361] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "8506e00f-2b77-4fa1-804a-8e548b78ee7d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 342.811s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2064.307552] env[68571]: INFO nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] During sync_power_state the instance has a pending task (spawning). Skip. [ 2064.307724] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "8506e00f-2b77-4fa1-804a-8e548b78ee7d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2064.307953] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9d05d2f1-80d2-4deb-afbc-5817b578127b tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Lock "8506e00f-2b77-4fa1-804a-8e548b78ee7d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 318.475s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2064.308177] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9d05d2f1-80d2-4deb-afbc-5817b578127b tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Acquiring lock "8506e00f-2b77-4fa1-804a-8e548b78ee7d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2064.308379] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9d05d2f1-80d2-4deb-afbc-5817b578127b tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Lock "8506e00f-2b77-4fa1-804a-8e548b78ee7d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2064.308545] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9d05d2f1-80d2-4deb-afbc-5817b578127b tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Lock "8506e00f-2b77-4fa1-804a-8e548b78ee7d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68571) inner
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2064.310496] env[68571]: INFO nova.compute.manager [None req-9d05d2f1-80d2-4deb-afbc-5817b578127b tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Terminating instance [ 2064.312427] env[68571]: DEBUG nova.compute.manager [None req-9d05d2f1-80d2-4deb-afbc-5817b578127b tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2064.312528] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-9d05d2f1-80d2-4deb-afbc-5817b578127b tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2064.313011] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c284b953-0786-4cac-86d4-e9b368d74749 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2064.323188] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-91cb2f6b-f4be-4d30-9393-151aacbde31e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2064.350789] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-9d05d2f1-80d2-4deb-afbc-5817b578127b tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 8506e00f-2b77-4fa1-804a-8e548b78ee7d could not be found. [ 2064.350981] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-9d05d2f1-80d2-4deb-afbc-5817b578127b tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2064.351166] env[68571]: INFO nova.compute.manager [None req-9d05d2f1-80d2-4deb-afbc-5817b578127b tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2064.351398] env[68571]: DEBUG oslo.service.loopingcall [None req-9d05d2f1-80d2-4deb-afbc-5817b578127b tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return.
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2064.351633] env[68571]: DEBUG nova.compute.manager [-] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2064.351729] env[68571]: DEBUG nova.network.neutron [-] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2064.374329] env[68571]: DEBUG nova.network.neutron [-] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2064.382971] env[68571]: INFO nova.compute.manager [-] [instance: 8506e00f-2b77-4fa1-804a-8e548b78ee7d] Took 0.03 seconds to deallocate network for instance. [ 2064.466908] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9d05d2f1-80d2-4deb-afbc-5817b578127b tempest-ServersTestJSON-433190037 tempest-ServersTestJSON-433190037-project-member] Lock "8506e00f-2b77-4fa1-804a-8e548b78ee7d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.159s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2089.497176] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2089.497457] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2089.510642] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2089.510870] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2089.511051] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2089.511215] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2089.512376] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-451abe53-4509-479d-82d1-d71dbb7e48e2 {{(pid=68571)
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2089.521541] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2c3e2d06-aa08-4e63-8e5f-ffdfffcef5d6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2089.535524] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e2a703df-760d-4d8a-8dc0-68b682aacbd3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2089.541775] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-392dfb3d-b0f5-4a36-be13-c99551e44fc4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2089.570827] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180879MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2089.570981] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2089.571188] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2089.628503] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 7fd03349-420c-4076-959c-31562e95098d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2089.628655] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 5deee3f1-70a0-4c0d-bda6-365235ca0d78 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2089.628785] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2089.628910] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 62ce83ad-bb1b-4f78-8d0b-9b516290bac6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2089.629040] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance ad3a9183-0e9e-44df-b920-b8b8360a65e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2089.629164] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 49f25bb6-d27f-468c-ba5d-2f5d96bb04df actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2089.629283] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 17530424-18ad-4713-ae56-acbe585bd5d9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2089.629458] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2089.629596] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2089.711012] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ebb8a225-a125-4243-804b-f380a7927cbd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2089.718375] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e7e7ce90-d0ef-49b3-9ccb-bb1445af35ca {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2089.747044] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2e7da9a9-1125-4a11-90ee-0fe6637696a6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2089.753600] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fc20ab5b-39aa-4043-bf52-4dc1b1d2ccf8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2089.765888] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2089.773538] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 
00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2089.786560] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2089.786746] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.216s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2090.779410] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2093.489456] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2094.490162] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2095.484884] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2097.490402] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2097.490721] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2097.490721] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2097.508298] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 7fd03349-420c-4076-959c-31562e95098d] Skipping network cache update for instance because it is Building. 
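
The inventory payload above is the node's entire scheduling contract: for each resource class, placement computes capacity as (total - reserved) * allocation_ratio, while min_unit, max_unit and step_size constrain each individual allocation (max_unit=16 caps any single flavor at 16 vCPUs here). A short worked sketch; schedulable_capacity is an illustrative name.

    # Capacity per resource class under the placement data model:
    # capacity = (total - reserved) * allocation_ratio.
    def schedulable_capacity(inventory):
        return {rc: int((f["total"] - f["reserved"]) * f["allocation_ratio"])
                for rc, f in inventory.items()}

    inventory = {
        "VCPU": {"total": 48, "reserved": 0, "allocation_ratio": 4.0},
        "MEMORY_MB": {"total": 196590, "reserved": 512, "allocation_ratio": 1.0},
        "DISK_GB": {"total": 400, "reserved": 0, "allocation_ratio": 1.0},
    }
    print(schedulable_capacity(inventory))
    # {'VCPU': 192, 'MEMORY_MB': 196078, 'DISK_GB': 400}
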
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2097.508446] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2097.508577] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2097.508703] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2097.508825] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2097.508945] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2097.509079] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2097.509203] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2097.509653] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2102.490450] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2102.490765] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... 
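
The burst of "Running periodic task" records, ending with the reclaim task short-circuiting on CONF.reclaim_instance_interval <= 0, follows the oslo.service periodic-task pattern. A minimal sketch assuming a locally registered option; the spacing value is made up.

    from oslo_config import cfg
    from oslo_service import periodic_task

    CONF = cfg.CONF
    CONF.register_opts([cfg.IntOpt("reclaim_instance_interval", default=0)])

    class Manager(periodic_task.PeriodicTasks):
        @periodic_task.periodic_task(spacing=10)
        def _reclaim_queued_deletes(self, context):
            # Same guard as the log record: a non-positive interval
            # makes this task a no-op.
            if CONF.reclaim_instance_interval <= 0:
                return

    Manager(CONF).run_periodic_tasks(context=None)
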
{{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2112.486273] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2112.500965] env[68571]: WARNING oslo_vmware.rw_handles [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2112.500965] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2112.500965] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2112.500965] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2112.500965] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2112.500965] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 2112.500965] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2112.500965] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2112.500965] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2112.500965] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2112.500965] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2112.500965] env[68571]: ERROR oslo_vmware.rw_handles [ 2112.502556] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/9006eab2-135e-4768-a1f2-54892bdb5ab8/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2112.503521] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2112.503755] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Copying Virtual Disk [datastore1] vmware_temp/9006eab2-135e-4768-a1f2-54892bdb5ab8/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/9006eab2-135e-4768-a1f2-54892bdb5ab8/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2112.504041] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-41b23910-14b8-4928-9016-48b936082558 {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2112.515290] env[68571]: DEBUG oslo_vmware.api [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Waiting for the task: (returnval){ [ 2112.515290] env[68571]: value = "task-3467779" [ 2112.515290] env[68571]: _type = "Task" [ 2112.515290] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2112.523385] env[68571]: DEBUG oslo_vmware.api [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Task: {'id': task-3467779, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2113.025898] env[68571]: DEBUG oslo_vmware.exceptions [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Fault InvalidArgument not matched. {{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2113.026491] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2113.026728] env[68571]: ERROR nova.compute.manager [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2113.026728] env[68571]: Faults: ['InvalidArgument'] [ 2113.026728] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] Traceback (most recent call last): [ 2113.026728] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2113.026728] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] yield resources [ 2113.026728] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2113.026728] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] self.driver.spawn(context, instance, image_meta, [ 2113.026728] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2113.026728] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2113.026728] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2113.026728] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] self._fetch_image_if_missing(context, 
vi) [ 2113.026728] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2113.027113] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] image_cache(vi, tmp_image_ds_loc) [ 2113.027113] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2113.027113] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] vm_util.copy_virtual_disk( [ 2113.027113] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2113.027113] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] session._wait_for_task(vmdk_copy_task) [ 2113.027113] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2113.027113] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] return self.wait_for_task(task_ref) [ 2113.027113] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2113.027113] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] return evt.wait() [ 2113.027113] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2113.027113] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] result = hub.switch() [ 2113.027113] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2113.027113] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] return self.greenlet.switch() [ 2113.027399] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2113.027399] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] self.f(*self.args, **self.kw) [ 2113.027399] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2113.027399] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] raise exceptions.translate_fault(task_info.error) [ 2113.027399] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2113.027399] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] Faults: ['InvalidArgument'] [ 2113.027399] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] [ 2113.027399] env[68571]: INFO nova.compute.manager [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c 
tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Terminating instance [ 2113.028565] env[68571]: DEBUG oslo_concurrency.lockutils [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2113.028772] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2113.030741] env[68571]: DEBUG nova.compute.manager [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2113.030931] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2113.031193] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-398eacac-5a6c-4a12-8f09-9e010252d58a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2113.033385] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7ba5977b-e3f0-47f1-b5ec-be0c8114979f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2113.039905] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2113.040122] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-dab039de-99b6-4888-9ea8-d03b08365183 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2113.042131] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2113.042305] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Folder [datastore1] devstack-image-cache_base created. 
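
The MakeDirectory and "Folder ... created" records trace the driver's image-cache layout on the datastore: the image is fetched into a per-request vmware_temp/<uuid>/<image-id>/tmp-sparse.vmdk location, then copied into the shared devstack-image-cache_base/<image-id>/ folder that later builds reuse. A string-level sketch of those two paths; Nova assembles them through its ds_util helpers rather than bare formatting.

    def cache_paths(datastore, request_uuid, image_id,
                    cache_folder="devstack-image-cache_base"):
        # Per-request temporary fetch target ...
        tmp = f"[{datastore}] vmware_temp/{request_uuid}/{image_id}/tmp-sparse.vmdk"
        # ... and the shared cache location it is copied to.
        cached = f"[{datastore}] {cache_folder}/{image_id}/{image_id}.vmdk"
        return tmp, cached

    print(cache_paths("datastore1",
                      "9bf1dcea-d191-4305-ab75-bdc7d500c763",
                      "6e7bf233-3ffe-4b3b-a510-62353d0292a6"))
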
{{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2113.043237] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-f455f2ab-6f89-4878-87a0-0c052b50b6ae {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2113.047850] env[68571]: DEBUG oslo_vmware.api [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Waiting for the task: (returnval){ [ 2113.047850] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52fd9967-c400-acd3-5382-01712038720a" [ 2113.047850] env[68571]: _type = "Task" [ 2113.047850] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2113.061340] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2113.061552] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Creating directory with path [datastore1] vmware_temp/9bf1dcea-d191-4305-ab75-bdc7d500c763/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2113.061756] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b5fc1153-814c-4bf9-8133-c24fcf092bf8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2113.081352] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Created directory with path [datastore1] vmware_temp/9bf1dcea-d191-4305-ab75-bdc7d500c763/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2113.081546] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Fetch image to [datastore1] vmware_temp/9bf1dcea-d191-4305-ab75-bdc7d500c763/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2113.081717] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/9bf1dcea-d191-4305-ab75-bdc7d500c763/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2113.082481] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4067b1a6-19ef-4966-9eb1-29f8747561e7 {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2113.090230] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f308b7af-1f60-4ded-9573-f1b7d9befcea {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2113.099092] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5ed78bb6-6e39-41c9-bca0-85babad5df6e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2113.130928] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5c4c32dd-fbe1-47cb-a6aa-a7524b48b21b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2113.133337] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2113.133532] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2113.133703] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Deleting the datastore file [datastore1] 7fd03349-420c-4076-959c-31562e95098d {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2113.133924] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-0d85efbb-bdf5-49ba-aa7f-a5f6e3fb4008 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2113.138403] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-fe72d38f-1a86-40bb-b3a5-84426aa569d0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2113.140974] env[68571]: DEBUG oslo_vmware.api [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Waiting for the task: (returnval){ [ 2113.140974] env[68571]: value = "task-3467781" [ 2113.140974] env[68571]: _type = "Task" [ 2113.140974] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2113.148522] env[68571]: DEBUG oslo_vmware.api [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Task: {'id': task-3467781, 'name': DeleteDatastoreFile_Task} progress is 0%. 
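
Each "Waiting for the task ... to complete" / "progress is 0%" pair is oslo.vmware's polling loop: the *_Task invocation returns a managed object reference immediately, and wait_for_task polls its info until success or a fault. A hedged sketch; the host, credentials and datacenter argument are placeholders, not values from this deployment.

    from oslo_vmware import api

    # Placeholder connection details: illustration only.
    session = api.VMwareAPISession("vc.example.test", "user", "secret",
                                   api_retry_count=3, task_poll_interval=0.5)
    task = session.invoke_api(
        session.vim, "DeleteDatastoreFile_Task",
        session.vim.service_content.fileManager,
        name="[datastore1] 7fd03349-420c-4076-959c-31562e95098d",
        datacenter=None)
    # Polls Task.info; a fault such as InvalidArgument surfaces as a
    # translated exception instead of a return value.
    session.wait_for_task(task)
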
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2113.160957] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2113.277894] env[68571]: DEBUG oslo_vmware.rw_handles [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9bf1dcea-d191-4305-ab75-bdc7d500c763/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2113.336190] env[68571]: DEBUG oslo_vmware.rw_handles [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2113.336368] env[68571]: DEBUG oslo_vmware.rw_handles [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Closing write handle for https://esx7c1n1.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/9bf1dcea-d191-4305-ab75-bdc7d500c763/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2113.652069] env[68571]: DEBUG oslo_vmware.api [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Task: {'id': task-3467781, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.073716} completed successfully. 
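
The write handle above targets the ESX host's /folder endpoint, carrying the datastore-relative file path in the URL path and the datacenter and datastore names as dcPath/dsName query parameters. A sketch of that URL shape with a placeholder host.

    from urllib.parse import quote, urlencode

    def datastore_upload_url(host, ds_relative_path,
                             dc_path="ha-datacenter", ds_name="datastore1"):
        query = urlencode({"dcPath": dc_path, "dsName": ds_name})
        return f"https://{host}:443/folder/{quote(ds_relative_path)}?{query}"

    print(datastore_upload_url(
        "esx.example.test",
        "vmware_temp/9bf1dcea-d191-4305-ab75-bdc7d500c763/"
        "6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk"))
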
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2113.652069] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2113.652069] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2113.652069] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2113.652500] env[68571]: INFO nova.compute.manager [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Took 0.62 seconds to destroy the instance on the hypervisor. [ 2113.654266] env[68571]: DEBUG nova.compute.claims [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2113.654448] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2113.654659] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2113.774840] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-df5747e0-183f-4f20-9174-748c1f769e9a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2113.781636] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-20cb5823-9418-40d9-8f0e-0ac19b461c59 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2113.810572] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-504c954c-bffe-4a57-b209-d5b895ad44c6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2113.817432] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-6e3a6ce3-c8be-4b24-aa52-181c69d36081 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2113.830196] env[68571]: DEBUG nova.compute.provider_tree [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2113.838490] env[68571]: DEBUG nova.scheduler.client.report [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2113.852886] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.198s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2113.853464] env[68571]: ERROR nova.compute.manager [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2113.853464] env[68571]: Faults: ['InvalidArgument'] [ 2113.853464] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] Traceback (most recent call last): [ 2113.853464] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2113.853464] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] self.driver.spawn(context, instance, image_meta, [ 2113.853464] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2113.853464] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2113.853464] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2113.853464] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] self._fetch_image_if_missing(context, vi) [ 2113.853464] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2113.853464] env[68571]: ERROR nova.compute.manager [instance: 
7fd03349-420c-4076-959c-31562e95098d] image_cache(vi, tmp_image_ds_loc) [ 2113.853464] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2113.853798] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] vm_util.copy_virtual_disk( [ 2113.853798] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2113.853798] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] session._wait_for_task(vmdk_copy_task) [ 2113.853798] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2113.853798] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] return self.wait_for_task(task_ref) [ 2113.853798] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2113.853798] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] return evt.wait() [ 2113.853798] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2113.853798] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] result = hub.switch() [ 2113.853798] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2113.853798] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] return self.greenlet.switch() [ 2113.853798] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2113.853798] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] self.f(*self.args, **self.kw) [ 2113.854292] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2113.854292] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] raise exceptions.translate_fault(task_info.error) [ 2113.854292] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2113.854292] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] Faults: ['InvalidArgument'] [ 2113.854292] env[68571]: ERROR nova.compute.manager [instance: 7fd03349-420c-4076-959c-31562e95098d] [ 2113.854292] env[68571]: DEBUG nova.compute.utils [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2113.855633] env[68571]: DEBUG 
nova.compute.manager [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Build of instance 7fd03349-420c-4076-959c-31562e95098d was re-scheduled: A specified parameter was not correct: fileType [ 2113.855633] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2113.856014] env[68571]: DEBUG nova.compute.manager [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2113.856206] env[68571]: DEBUG nova.compute.manager [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2113.856374] env[68571]: DEBUG nova.compute.manager [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2113.856538] env[68571]: DEBUG nova.network.neutron [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2114.178212] env[68571]: DEBUG nova.network.neutron [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2114.192402] env[68571]: INFO nova.compute.manager [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Took 0.34 seconds to deallocate network for instance. 
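
Read together, the records above are the standard build-failure path: the fault is recorded, the resource claim is aborted, networking is deallocated, and the request is re-scheduled rather than failed outright (the driver cannot even say whether VIFs need unplugging). A deliberately simplified, hypothetical rendering of that control flow; none of these names are Nova's real signatures.

    def build_and_run(instance, spawn, abort_claim, deallocate_network,
                      reschedule):
        try:
            spawn(instance)
        except Exception as fault:
            # "Failed to build and run instance: <fault>"
            abort_claim(instance)
            deallocate_network(instance)
            # "Build of instance ... was re-scheduled: <fault>"
            reschedule(instance, fault)
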
[ 2114.288478] env[68571]: INFO nova.scheduler.client.report [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Deleted allocations for instance 7fd03349-420c-4076-959c-31562e95098d [ 2114.309146] env[68571]: DEBUG oslo_concurrency.lockutils [None req-3ef72854-88ac-4a0c-9f62-06231eb5595c tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Lock "7fd03349-420c-4076-959c-31562e95098d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 541.145s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2114.309401] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "7fd03349-420c-4076-959c-31562e95098d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: waited 392.813s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2114.309589] env[68571]: INFO nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 7fd03349-420c-4076-959c-31562e95098d] During sync_power_state the instance has a pending task (spawning). Skip. [ 2114.309759] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "7fd03349-420c-4076-959c-31562e95098d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2114.309989] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b6674472-2035-459b-9b56-fb9151b3d6b5 tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Lock "7fd03349-420c-4076-959c-31562e95098d" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 345.411s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2114.310226] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b6674472-2035-459b-9b56-fb9151b3d6b5 tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Acquiring lock "7fd03349-420c-4076-959c-31562e95098d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2114.310428] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b6674472-2035-459b-9b56-fb9151b3d6b5 tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Lock "7fd03349-420c-4076-959c-31562e95098d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2114.310589] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b6674472-2035-459b-9b56-fb9151b3d6b5 tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Lock "7fd03349-420c-4076-959c-31562e95098d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68571) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2114.312490] env[68571]: INFO nova.compute.manager [None req-b6674472-2035-459b-9b56-fb9151b3d6b5 tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Terminating instance [ 2114.314130] env[68571]: DEBUG nova.compute.manager [None req-b6674472-2035-459b-9b56-fb9151b3d6b5 tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2114.314340] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-b6674472-2035-459b-9b56-fb9151b3d6b5 tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2114.314805] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-e2f7d912-57f4-413d-a549-906d1c8039ae {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2114.324141] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f9198cbb-5920-4157-886b-bb2e0b8229da {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2114.351303] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-b6674472-2035-459b-9b56-fb9151b3d6b5 tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 7fd03349-420c-4076-959c-31562e95098d could not be found. [ 2114.351529] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-b6674472-2035-459b-9b56-fb9151b3d6b5 tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2114.351709] env[68571]: INFO nova.compute.manager [None req-b6674472-2035-459b-9b56-fb9151b3d6b5 tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] [instance: 7fd03349-420c-4076-959c-31562e95098d] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2114.351954] env[68571]: DEBUG oslo.service.loopingcall [None req-b6674472-2035-459b-9b56-fb9151b3d6b5 tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. 
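
"Waiting for function ..._deallocate_network_with_retries to return" is oslo.service's looping-call machinery: the looped function raises LoopingCallDone to end the loop, and start().wait() blocks until it does. A minimal sketch; the retry counter is invented for the example.

    from oslo_service import loopingcall

    attempts = {"count": 0}

    def _deallocate_with_retries():
        attempts["count"] += 1
        if attempts["count"] >= 3:      # pretend the third attempt succeeds
            raise loopingcall.LoopingCallDone()

    timer = loopingcall.FixedIntervalLoopingCall(_deallocate_with_retries)
    timer.start(interval=0.1).wait()
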
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2114.352213] env[68571]: DEBUG nova.compute.manager [-] [instance: 7fd03349-420c-4076-959c-31562e95098d] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2114.352313] env[68571]: DEBUG nova.network.neutron [-] [instance: 7fd03349-420c-4076-959c-31562e95098d] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2114.376015] env[68571]: DEBUG nova.network.neutron [-] [instance: 7fd03349-420c-4076-959c-31562e95098d] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2114.383648] env[68571]: INFO nova.compute.manager [-] [instance: 7fd03349-420c-4076-959c-31562e95098d] Took 0.03 seconds to deallocate network for instance. [ 2114.469398] env[68571]: DEBUG oslo_concurrency.lockutils [None req-b6674472-2035-459b-9b56-fb9151b3d6b5 tempest-AttachVolumeTestJSON-1020281048 tempest-AttachVolumeTestJSON-1020281048-project-member] Lock "7fd03349-420c-4076-959c-31562e95098d" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.159s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2140.323219] env[68571]: DEBUG oslo_concurrency.lockutils [None req-af63edaa-85cb-4b94-844b-6dd7fce0a9f7 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Acquiring lock "ad3a9183-0e9e-44df-b920-b8b8360a65e5" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2141.166602] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "1799a6b4-70c1-4a96-9bf9-4e855c11039f" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2141.166831] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "1799a6b4-70c1-4a96-9bf9-4e855c11039f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2141.179105] env[68571]: DEBUG nova.compute.manager [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Starting instance... 
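
The Acquiring/acquired/"released" triplets keyed by instance UUID come from oslo.concurrency's named locks: build and terminate paths serialize on the instance UUID as the lock name, which is why the build lock for 1799a6b4-70c1-4a96-9bf9-4e855c11039f is acquired here with zero wait. A minimal sketch reusing that UUID.

    from oslo_concurrency import lockutils

    @lockutils.synchronized("1799a6b4-70c1-4a96-9bf9-4e855c11039f")
    def _locked_do_build_and_run_instance():
        # Only one build/terminate path runs per instance UUID at a time;
        # lockutils emits the acquire/release DEBUG records seen here.
        pass

    _locked_do_build_and_run_instance()
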
{{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2406}} [ 2141.227407] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2141.227684] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2141.229059] env[68571]: INFO nova.compute.claims [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Claim successful on node domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 [ 2141.352653] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05c5de0d-bea3-438d-a511-af8d470bffa0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2141.360278] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c0cbc0cc-6562-4a50-8909-2b5a68bfbf1a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2141.389549] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e21768f0-0bde-456b-b166-478e104f85d8 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2141.396376] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7704a7f5-20fd-448b-8aed-8e58d9342d25 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2141.410059] env[68571]: DEBUG nova.compute.provider_tree [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2141.418476] env[68571]: DEBUG nova.scheduler.client.report [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2141.432898] env[68571]: DEBUG oslo_concurrency.lockutils 
[None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.205s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2141.433390] env[68571]: DEBUG nova.compute.manager [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Start building networks asynchronously for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2803}} [ 2141.463775] env[68571]: DEBUG nova.compute.utils [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Using /dev/sd instead of None {{(pid=68571) get_next_device_name /opt/stack/nova/nova/compute/utils.py:238}} [ 2141.464974] env[68571]: DEBUG nova.compute.manager [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Allocating IP information in the background. {{(pid=68571) _allocate_network_async /opt/stack/nova/nova/compute/manager.py:1956}} [ 2141.465162] env[68571]: DEBUG nova.network.neutron [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] allocate_for_instance() {{(pid=68571) allocate_for_instance /opt/stack/nova/nova/network/neutron.py:1156}} [ 2141.473534] env[68571]: DEBUG nova.compute.manager [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Start building block device mappings for instance. {{(pid=68571) _build_resources /opt/stack/nova/nova/compute/manager.py:2838}} [ 2141.546301] env[68571]: DEBUG nova.policy [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd34e5361b36c4dc5824b0f42a37e6bb8', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '290427ab03f446ce9297ea393c083ff9', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} {{(pid=68571) authorize /opt/stack/nova/nova/policy.py:203}} [ 2141.555839] env[68571]: DEBUG nova.compute.manager [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Start spawning the instance on the hypervisor. 
{{(pid=68571) _build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2612}} [ 2141.580030] env[68571]: DEBUG nova.virt.hardware [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Getting desirable topologies for flavor Flavor(created_at=2025-04-16T21:24:55Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='42',id=11,is_public=True,memory_mb=128,name='m1.nano',projects=,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='879b8eacf4b511d84bac79c7fe0e0d0a',container_format='bare',created_at=2025-04-16T21:24:40Z,direct_url=,disk_format='vmdk',id=6e7bf233-3ffe-4b3b-a510-62353d0292a6,min_disk=0,min_ram=0,name='cirros-d240228-sparse;paraVirtual;vmxnet3',owner='129da41d4b1a4202be57f86562f628cb',properties=ImageMetaProps,protected=,size=21318656,status='active',tags=,updated_at=2025-04-16T21:24:40Z,virtual_size=,visibility=), allow threads: False {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:563}} [ 2141.580268] env[68571]: DEBUG nova.virt.hardware [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Flavor limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:348}} [ 2141.580421] env[68571]: DEBUG nova.virt.hardware [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Image limits 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:352}} [ 2141.580599] env[68571]: DEBUG nova.virt.hardware [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Flavor pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:388}} [ 2141.580765] env[68571]: DEBUG nova.virt.hardware [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Image pref 0:0:0 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:392}} [ 2141.580968] env[68571]: DEBUG nova.virt.hardware [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 {{(pid=68571) get_cpu_topology_constraints /opt/stack/nova/nova/virt/hardware.py:430}} [ 2141.581202] env[68571]: DEBUG nova.virt.hardware [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:569}} [ 2141.581363] env[68571]: DEBUG nova.virt.hardware [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Build topologies for 1 vcpu(s) 1:1:1 {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:471}} [ 2141.581527] env[68571]: DEBUG 
nova.virt.hardware [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Got 1 possible topologies {{(pid=68571) _get_possible_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:501}} [ 2141.581689] env[68571]: DEBUG nova.virt.hardware [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:575}} [ 2141.581856] env[68571]: DEBUG nova.virt.hardware [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] {{(pid=68571) _get_desirable_cpu_topologies /opt/stack/nova/nova/virt/hardware.py:577}} [ 2141.582694] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12cd341a-e7a5-4585-834b-7f307e987fd3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2141.590109] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5cefb94d-176c-46d8-a3be-dc42a2a5511c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2141.848875] env[68571]: DEBUG nova.network.neutron [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Successfully created port: a271d94d-cb0b-4da8-8066-d95875b20449 {{(pid=68571) _create_port_minimal /opt/stack/nova/nova/network/neutron.py:548}} [ 2142.488088] env[68571]: DEBUG nova.compute.manager [req-25324def-0c5e-4624-8097-30c72dc9199d req-92914d16-75f9-4db9-9de6-3b73b7434c40 service nova] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Received event network-vif-plugged-a271d94d-cb0b-4da8-8066-d95875b20449 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2142.488330] env[68571]: DEBUG oslo_concurrency.lockutils [req-25324def-0c5e-4624-8097-30c72dc9199d req-92914d16-75f9-4db9-9de6-3b73b7434c40 service nova] Acquiring lock "1799a6b4-70c1-4a96-9bf9-4e855c11039f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2142.488517] env[68571]: DEBUG oslo_concurrency.lockutils [req-25324def-0c5e-4624-8097-30c72dc9199d req-92914d16-75f9-4db9-9de6-3b73b7434c40 service nova] Lock "1799a6b4-70c1-4a96-9bf9-4e855c11039f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2142.488692] env[68571]: DEBUG oslo_concurrency.lockutils [req-25324def-0c5e-4624-8097-30c72dc9199d req-92914d16-75f9-4db9-9de6-3b73b7434c40 service nova] Lock "1799a6b4-70c1-4a96-9bf9-4e855c11039f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2142.488856] env[68571]: DEBUG 
nova.compute.manager [req-25324def-0c5e-4624-8097-30c72dc9199d req-92914d16-75f9-4db9-9de6-3b73b7434c40 service nova] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] No waiting events found dispatching network-vif-plugged-a271d94d-cb0b-4da8-8066-d95875b20449 {{(pid=68571) pop_instance_event /opt/stack/nova/nova/compute/manager.py:320}} [ 2142.489080] env[68571]: WARNING nova.compute.manager [req-25324def-0c5e-4624-8097-30c72dc9199d req-92914d16-75f9-4db9-9de6-3b73b7434c40 service nova] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Received unexpected event network-vif-plugged-a271d94d-cb0b-4da8-8066-d95875b20449 for instance with vm_state building and task_state spawning. [ 2142.571160] env[68571]: DEBUG nova.network.neutron [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Successfully updated port: a271d94d-cb0b-4da8-8066-d95875b20449 {{(pid=68571) _update_port /opt/stack/nova/nova/network/neutron.py:586}} [ 2142.587607] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "refresh_cache-1799a6b4-70c1-4a96-9bf9-4e855c11039f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2142.587758] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquired lock "refresh_cache-1799a6b4-70c1-4a96-9bf9-4e855c11039f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2142.587894] env[68571]: DEBUG nova.network.neutron [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2142.625572] env[68571]: DEBUG nova.network.neutron [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Instance cache missing network info. 
{{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2142.773015] env[68571]: DEBUG nova.network.neutron [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Updating instance_info_cache with network_info: [{"id": "a271d94d-cb0b-4da8-8066-d95875b20449", "address": "fa:16:3e:19:d7:1e", "network": {"id": "653e8d49-b7ab-4d09-aa68-b76012e5b38e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-503364041-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "290427ab03f446ce9297ea393c083ff9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa271d94d-cb", "ovs_interfaceid": "a271d94d-cb0b-4da8-8066-d95875b20449", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2142.785422] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Releasing lock "refresh_cache-1799a6b4-70c1-4a96-9bf9-4e855c11039f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2142.785718] env[68571]: DEBUG nova.compute.manager [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Instance network_info: |[{"id": "a271d94d-cb0b-4da8-8066-d95875b20449", "address": "fa:16:3e:19:d7:1e", "network": {"id": "653e8d49-b7ab-4d09-aa68-b76012e5b38e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-503364041-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "290427ab03f446ce9297ea393c083ff9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa271d94d-cb", "ovs_interfaceid": "a271d94d-cb0b-4da8-8066-d95875b20449", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| {{(pid=68571) _allocate_network_async 
/opt/stack/nova/nova/compute/manager.py:1971}} [ 2142.786137] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Instance VIF info [{'network_name': 'br-int', 'mac_address': 'fa:16:3e:19:d7:1e', 'network_ref': {'type': 'OpaqueNetwork', 'network-id': '2180b40f-2bb0-47da-ba80-c2fbe7f98af0', 'network-type': 'nsx.LogicalSwitch', 'use-external-id': True}, 'iface_id': 'a271d94d-cb0b-4da8-8066-d95875b20449', 'vif_model': 'vmxnet3'}] {{(pid=68571) build_virtual_machine /opt/stack/nova/nova/virt/vmwareapi/vmops.py:279}} [ 2142.793558] env[68571]: DEBUG oslo.service.loopingcall [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Waiting for function nova.virt.vmwareapi.vm_util.create_vm to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2142.793986] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Creating VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1327}} [ 2142.794230] env[68571]: DEBUG oslo_vmware.service [-] Invoking Folder.CreateVM_Task with opID=oslo.vmware-98b0119f-6bf0-4500-8c53-98ced63bea50 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2142.814129] env[68571]: DEBUG oslo_vmware.api [-] Waiting for the task: (returnval){ [ 2142.814129] env[68571]: value = "task-3467782" [ 2142.814129] env[68571]: _type = "Task" [ 2142.814129] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2142.821340] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467782, 'name': CreateVM_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2143.325113] env[68571]: DEBUG oslo_vmware.api [-] Task: {'id': task-3467782, 'name': CreateVM_Task, 'duration_secs': 0.370653} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2143.325294] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [-] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Created VM on the ESX host {{(pid=68571) create_vm /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1349}} [ 2143.325947] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2143.326158] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2143.326435] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquired external semaphore "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:323}} [ 2143.326670] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-2b6e7305-857c-49fd-a91c-1cc571476263 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2143.331132] env[68571]: DEBUG oslo_vmware.api [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Waiting for the task: (returnval){ [ 2143.331132] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52398513-29ed-8608-c16f-77fb1893e3cf" [ 2143.331132] env[68571]: _type = "Task" [ 2143.331132] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2143.338525] env[68571]: DEBUG oslo_vmware.api [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52398513-29ed-8608-c16f-77fb1893e3cf, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2143.841516] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2143.841935] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Processing image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:624}} [ 2143.842059] env[68571]: DEBUG oslo_concurrency.lockutils [None req-2642877c-dfae-4550-b36b-2aefea0e68b7 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2144.514114] env[68571]: DEBUG nova.compute.manager [req-0077c0a1-4150-40a2-b8f6-232a5ccd1ff2 req-9a387cb5-317b-424b-b83b-9c51b4e43335 service nova] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Received event network-changed-a271d94d-cb0b-4da8-8066-d95875b20449 {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11101}} [ 2144.514277] env[68571]: DEBUG nova.compute.manager [req-0077c0a1-4150-40a2-b8f6-232a5ccd1ff2 req-9a387cb5-317b-424b-b83b-9c51b4e43335 service nova] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Refreshing instance network info cache due to event network-changed-a271d94d-cb0b-4da8-8066-d95875b20449. {{(pid=68571) external_instance_event /opt/stack/nova/nova/compute/manager.py:11106}} [ 2144.514497] env[68571]: DEBUG oslo_concurrency.lockutils [req-0077c0a1-4150-40a2-b8f6-232a5ccd1ff2 req-9a387cb5-317b-424b-b83b-9c51b4e43335 service nova] Acquiring lock "refresh_cache-1799a6b4-70c1-4a96-9bf9-4e855c11039f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2144.514658] env[68571]: DEBUG oslo_concurrency.lockutils [req-0077c0a1-4150-40a2-b8f6-232a5ccd1ff2 req-9a387cb5-317b-424b-b83b-9c51b4e43335 service nova] Acquired lock "refresh_cache-1799a6b4-70c1-4a96-9bf9-4e855c11039f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2144.514824] env[68571]: DEBUG nova.network.neutron [req-0077c0a1-4150-40a2-b8f6-232a5ccd1ff2 req-9a387cb5-317b-424b-b83b-9c51b4e43335 service nova] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Refreshing network info cache for port a271d94d-cb0b-4da8-8066-d95875b20449 {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2007}} [ 2144.760417] env[68571]: DEBUG nova.network.neutron [req-0077c0a1-4150-40a2-b8f6-232a5ccd1ff2 req-9a387cb5-317b-424b-b83b-9c51b4e43335 service nova] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Updated VIF entry in instance network info cache for port a271d94d-cb0b-4da8-8066-d95875b20449. 
{{(pid=68571) _build_network_info_model /opt/stack/nova/nova/network/neutron.py:3482}} [ 2144.760859] env[68571]: DEBUG nova.network.neutron [req-0077c0a1-4150-40a2-b8f6-232a5ccd1ff2 req-9a387cb5-317b-424b-b83b-9c51b4e43335 service nova] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Updating instance_info_cache with network_info: [{"id": "a271d94d-cb0b-4da8-8066-d95875b20449", "address": "fa:16:3e:19:d7:1e", "network": {"id": "653e8d49-b7ab-4d09-aa68-b76012e5b38e", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-503364041-network", "subnets": [{"cidr": "192.168.128.0/28", "dns": [], "gateway": {"address": "192.168.128.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "192.168.128.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "192.168.128.2"}}], "meta": {"injected": false, "tenant_id": "290427ab03f446ce9297ea393c083ff9", "mtu": 8950, "physical_network": "default", "tunneled": false}}, "type": "ovs", "details": {"connectivity": "l2", "port_filter": true, "nsx-logical-switch-id": "2180b40f-2bb0-47da-ba80-c2fbe7f98af0", "external-id": "nsx-vlan-transportzone-970", "segmentation_id": 970, "bound_drivers": {"0": "nsxv3"}}, "devname": "tapa271d94d-cb", "ovs_interfaceid": "a271d94d-cb0b-4da8-8066-d95875b20449", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2144.770135] env[68571]: DEBUG oslo_concurrency.lockutils [req-0077c0a1-4150-40a2-b8f6-232a5ccd1ff2 req-9a387cb5-317b-424b-b83b-9c51b4e43335 service nova] Releasing lock "refresh_cache-1799a6b4-70c1-4a96-9bf9-4e855c11039f" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2149.489660] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2151.489609] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2151.489970] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2151.503320] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2151.503531] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2151.503738] env[68571]: 
DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2151.503893] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2151.505029] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-00b35908-bacd-48b1-9ead-88256959bea1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2151.513483] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2f068a7c-ff3d-4dc7-9daa-b348f48cc634 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2151.527011] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d0e4cebe-e9c7-4aaf-97f6-53483f0026be {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2151.532921] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f245b0b0-e1df-4b8f-a94c-c5f9b45e1538 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2151.561093] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180900MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2151.561237] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2151.561420] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2151.649557] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 5deee3f1-70a0-4c0d-bda6-365235ca0d78 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2151.649723] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2151.649855] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 62ce83ad-bb1b-4f78-8d0b-9b516290bac6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2151.649978] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance ad3a9183-0e9e-44df-b920-b8b8360a65e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2151.650111] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 49f25bb6-d27f-468c-ba5d-2f5d96bb04df actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2151.650229] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 17530424-18ad-4713-ae56-acbe585bd5d9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2151.650344] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 1799a6b4-70c1-4a96-9bf9-4e855c11039f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2151.650525] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 7 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2151.650662] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1408MB phys_disk=200GB used_disk=7GB total_vcpus=48 used_vcpus=7 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2151.734899] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a56a0dd2-37de-4d5b-9872-6b23463718bb {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2151.743275] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ec8e4a8e-9666-44dc-93a9-962ebca62742 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2151.772091] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8aeb3f8b-f0c9-42b6-b7b6-8ae6af700bac {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2151.778903] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d4ee4468-c42e-4528-8b2b-0d7b05210c64 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2151.791371] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2151.799275] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2151.815609] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2151.815790] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.254s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2153.815937] env[68571]: DEBUG oslo_service.periodic_task [None 
req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2156.489923] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2157.485779] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2157.489243] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2157.489384] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2157.489502] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2157.511364] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2157.511621] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2157.511673] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2157.511761] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2157.511885] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2157.512018] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Skipping network cache update for instance because it is Building. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2157.512144] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2157.512267] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2159.489808] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2162.525773] env[68571]: WARNING oslo_vmware.rw_handles [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2162.525773] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2162.525773] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2162.525773] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2162.525773] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2162.525773] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 2162.525773] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2162.525773] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2162.525773] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2162.525773] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2162.525773] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2162.525773] env[68571]: ERROR oslo_vmware.rw_handles [ 2162.526437] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/9bf1dcea-d191-4305-ab75-bdc7d500c763/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2162.528653] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2162.528933] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Copying Virtual Disk [datastore1] 
vmware_temp/9bf1dcea-d191-4305-ab75-bdc7d500c763/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/9bf1dcea-d191-4305-ab75-bdc7d500c763/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2162.529243] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-730bfa33-e4c4-49e3-ab6a-20823a181235 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2162.536762] env[68571]: DEBUG oslo_vmware.api [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Waiting for the task: (returnval){ [ 2162.536762] env[68571]: value = "task-3467783" [ 2162.536762] env[68571]: _type = "Task" [ 2162.536762] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2162.544255] env[68571]: DEBUG oslo_vmware.api [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Task: {'id': task-3467783, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2163.048218] env[68571]: DEBUG oslo_vmware.exceptions [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Fault InvalidArgument not matched. {{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2163.048515] env[68571]: DEBUG oslo_concurrency.lockutils [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2163.049064] env[68571]: ERROR nova.compute.manager [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2163.049064] env[68571]: Faults: ['InvalidArgument'] [ 2163.049064] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Traceback (most recent call last): [ 2163.049064] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2163.049064] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] yield resources [ 2163.049064] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2163.049064] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] self.driver.spawn(context, instance, image_meta, [ 2163.049064] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 
2163.049064] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2163.049064] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2163.049064] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] self._fetch_image_if_missing(context, vi) [ 2163.049064] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2163.049405] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] image_cache(vi, tmp_image_ds_loc) [ 2163.049405] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2163.049405] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] vm_util.copy_virtual_disk( [ 2163.049405] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2163.049405] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] session._wait_for_task(vmdk_copy_task) [ 2163.049405] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2163.049405] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] return self.wait_for_task(task_ref) [ 2163.049405] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2163.049405] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] return evt.wait() [ 2163.049405] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2163.049405] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] result = hub.switch() [ 2163.049405] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2163.049405] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] return self.greenlet.switch() [ 2163.049767] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2163.049767] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] self.f(*self.args, **self.kw) [ 2163.049767] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2163.049767] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] raise exceptions.translate_fault(task_info.error) [ 2163.049767] env[68571]: ERROR nova.compute.manager [instance: 
5deee3f1-70a0-4c0d-bda6-365235ca0d78] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2163.049767] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Faults: ['InvalidArgument'] [ 2163.049767] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] [ 2163.049767] env[68571]: INFO nova.compute.manager [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Terminating instance [ 2163.050859] env[68571]: DEBUG oslo_concurrency.lockutils [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2163.051078] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2163.051318] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-b0eeb3dc-0518-4787-8a36-ee9e1122ad36 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2163.053541] env[68571]: DEBUG nova.compute.manager [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Start destroying the instance on the hypervisor. 
{{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2163.054091] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2163.054516] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-474dfd6a-5f98-4320-9d0d-da95568ae2f3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2163.061168] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2163.061377] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-d2cca1e5-78e8-4ebd-801c-ca92b76b1385 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2163.063472] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2163.063646] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2163.064639] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-bb2001e8-d8fa-42db-9cd5-c6f9bdcb082e {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2163.069243] env[68571]: DEBUG oslo_vmware.api [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Waiting for the task: (returnval){ [ 2163.069243] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52e6b7a8-84fa-b2bf-0818-8874abac995e" [ 2163.069243] env[68571]: _type = "Task" [ 2163.069243] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2163.076174] env[68571]: DEBUG oslo_vmware.api [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52e6b7a8-84fa-b2bf-0818-8874abac995e, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2163.124508] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2163.124710] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2163.124909] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Deleting the datastore file [datastore1] 5deee3f1-70a0-4c0d-bda6-365235ca0d78 {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2163.125189] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-3733ee3b-06ea-4660-8a58-5c40dccc9856 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2163.130703] env[68571]: DEBUG oslo_vmware.api [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Waiting for the task: (returnval){ [ 2163.130703] env[68571]: value = "task-3467785" [ 2163.130703] env[68571]: _type = "Task" [ 2163.130703] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2163.137957] env[68571]: DEBUG oslo_vmware.api [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Task: {'id': task-3467785, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2163.579637] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2163.580113] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Creating directory with path [datastore1] vmware_temp/c27fd0d1-6388-4915-b79b-604f9f918a44/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2163.580216] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-fa73b6e4-7618-446a-8df7-989ce015a70c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2163.591255] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Created directory with path [datastore1] vmware_temp/c27fd0d1-6388-4915-b79b-604f9f918a44/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2163.591424] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Fetch image to [datastore1] vmware_temp/c27fd0d1-6388-4915-b79b-604f9f918a44/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2163.591584] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/c27fd0d1-6388-4915-b79b-604f9f918a44/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2163.592336] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0bbc1e7c-e88c-45be-8adb-bfb7c7f83201 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2163.598512] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4feb58fc-e7bc-41a6-b821-016169fb6315 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2163.606935] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-06e8147f-e94a-44a4-90cc-4a254291f453 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2163.638742] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5b17d323-aaa2-4070-bbcb-bad985ab52e0 
{{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2163.645307] env[68571]: DEBUG oslo_vmware.api [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Task: {'id': task-3467785, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075476} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2163.646656] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2163.646844] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2163.647027] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2163.647206] env[68571]: INFO nova.compute.manager [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Took 0.59 seconds to destroy the instance on the hypervisor. 
[ 2163.648885] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-005e4033-1a52-4225-b0a3-90876f5ba796 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2163.650711] env[68571]: DEBUG nova.compute.claims [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2163.650890] env[68571]: DEBUG oslo_concurrency.lockutils [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2163.651109] env[68571]: DEBUG oslo_concurrency.lockutils [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2163.672429] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2163.722600] env[68571]: DEBUG oslo_vmware.rw_handles [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c27fd0d1-6388-4915-b79b-604f9f918a44/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2163.781490] env[68571]: DEBUG oslo_vmware.rw_handles [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2163.781490] env[68571]: DEBUG oslo_vmware.rw_handles [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/c27fd0d1-6388-4915-b79b-604f9f918a44/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2163.831877] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dba8e87e-0251-4251-90b5-2bcc29e2a90a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2163.839513] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f47cccbf-6314-4b60-8b27-2d9dafb3a9cb {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2163.868464] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-843a37d7-da0d-4e51-9166-0afefa31181d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2163.875260] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9e35dd16-b951-49ee-9c0b-13ccbc83936b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2163.887815] env[68571]: DEBUG nova.compute.provider_tree [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2163.895925] env[68571]: DEBUG nova.scheduler.client.report [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2163.908823] env[68571]: DEBUG oslo_concurrency.lockutils [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.258s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2163.909391] env[68571]: ERROR nova.compute.manager [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2163.909391] env[68571]: Faults: ['InvalidArgument'] [ 2163.909391] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Traceback (most recent call last): [ 2163.909391] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2163.909391] env[68571]: ERROR nova.compute.manager [instance: 
5deee3f1-70a0-4c0d-bda6-365235ca0d78] self.driver.spawn(context, instance, image_meta, [ 2163.909391] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2163.909391] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2163.909391] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2163.909391] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] self._fetch_image_if_missing(context, vi) [ 2163.909391] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2163.909391] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] image_cache(vi, tmp_image_ds_loc) [ 2163.909391] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2163.909681] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] vm_util.copy_virtual_disk( [ 2163.909681] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2163.909681] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] session._wait_for_task(vmdk_copy_task) [ 2163.909681] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2163.909681] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] return self.wait_for_task(task_ref) [ 2163.909681] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2163.909681] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] return evt.wait() [ 2163.909681] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2163.909681] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] result = hub.switch() [ 2163.909681] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2163.909681] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] return self.greenlet.switch() [ 2163.909681] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2163.909681] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] self.f(*self.args, **self.kw) [ 2163.910642] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2163.910642] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] raise exceptions.translate_fault(task_info.error) [ 2163.910642] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2163.910642] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Faults: ['InvalidArgument'] [ 2163.910642] env[68571]: ERROR nova.compute.manager [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] [ 2163.910642] env[68571]: DEBUG nova.compute.utils [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2163.911378] env[68571]: DEBUG nova.compute.manager [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Build of instance 5deee3f1-70a0-4c0d-bda6-365235ca0d78 was re-scheduled: A specified parameter was not correct: fileType [ 2163.911378] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2163.911746] env[68571]: DEBUG nova.compute.manager [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2163.911922] env[68571]: DEBUG nova.compute.manager [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
{{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2163.912104] env[68571]: DEBUG nova.compute.manager [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2163.912269] env[68571]: DEBUG nova.network.neutron [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2164.216142] env[68571]: DEBUG nova.network.neutron [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2164.230437] env[68571]: INFO nova.compute.manager [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Took 0.32 seconds to deallocate network for instance. [ 2164.321510] env[68571]: INFO nova.scheduler.client.report [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Deleted allocations for instance 5deee3f1-70a0-4c0d-bda6-365235ca0d78 [ 2164.346830] env[68571]: DEBUG oslo_concurrency.lockutils [None req-601781b7-d547-4d83-9dc6-a2c5e28fc044 tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Lock "5deee3f1-70a0-4c0d-bda6-365235ca0d78" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 583.398s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2164.347279] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9e739162-2179-4d1d-a385-510daa6bb14e tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Lock "5deee3f1-70a0-4c0d-bda6-365235ca0d78" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 387.592s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2164.347385] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9e739162-2179-4d1d-a385-510daa6bb14e tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Acquiring lock "5deee3f1-70a0-4c0d-bda6-365235ca0d78-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2164.347501] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9e739162-2179-4d1d-a385-510daa6bb14e tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Lock "5deee3f1-70a0-4c0d-bda6-365235ca0d78-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 
2164.347661] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9e739162-2179-4d1d-a385-510daa6bb14e tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Lock "5deee3f1-70a0-4c0d-bda6-365235ca0d78-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2164.349582] env[68571]: INFO nova.compute.manager [None req-9e739162-2179-4d1d-a385-510daa6bb14e tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Terminating instance [ 2164.353258] env[68571]: DEBUG nova.compute.manager [None req-9e739162-2179-4d1d-a385-510daa6bb14e tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2164.353452] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-9e739162-2179-4d1d-a385-510daa6bb14e tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2164.353720] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-a7227539-23e7-4d2d-8d3b-ea58f2834768 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2164.364622] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-12fa47b5-a620-4772-906b-84662d5360bd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2164.390848] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-9e739162-2179-4d1d-a385-510daa6bb14e tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 5deee3f1-70a0-4c0d-bda6-365235ca0d78 could not be found. [ 2164.391048] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-9e739162-2179-4d1d-a385-510daa6bb14e tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2164.391220] env[68571]: INFO nova.compute.manager [None req-9e739162-2179-4d1d-a385-510daa6bb14e tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2164.391455] env[68571]: DEBUG oslo.service.loopingcall [None req-9e739162-2179-4d1d-a385-510daa6bb14e tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. 
{{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2164.391900] env[68571]: DEBUG nova.compute.manager [-] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2164.392008] env[68571]: DEBUG nova.network.neutron [-] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2164.414616] env[68571]: DEBUG nova.network.neutron [-] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2164.422296] env[68571]: INFO nova.compute.manager [-] [instance: 5deee3f1-70a0-4c0d-bda6-365235ca0d78] Took 0.03 seconds to deallocate network for instance. [ 2164.488898] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2164.489262] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2164.518562] env[68571]: DEBUG oslo_concurrency.lockutils [None req-9e739162-2179-4d1d-a385-510daa6bb14e tempest-ServersTestJSON-1811012872 tempest-ServersTestJSON-1811012872-project-member] Lock "5deee3f1-70a0-4c0d-bda6-365235ca0d78" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.171s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2170.910144] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bfcdb4bb-0c10-477b-83a7-38ffe08df641 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Acquiring lock "49f25bb6-d27f-468c-ba5d-2f5d96bb04df" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2193.586462] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e19f2e3f-eda8-4cd9-8a28-d760fb59c8c5 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquiring lock "17530424-18ad-4713-ae56-acbe585bd5d9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2211.492792] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2212.536664] env[68571]: WARNING oslo_vmware.rw_handles [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2212.536664] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): 
[ 2212.536664] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2212.536664] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2212.536664] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2212.536664] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 2212.536664] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2212.536664] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2212.536664] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2212.536664] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2212.536664] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2212.536664] env[68571]: ERROR oslo_vmware.rw_handles [ 2212.536664] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/c27fd0d1-6388-4915-b79b-604f9f918a44/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2212.539067] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2212.539197] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Copying Virtual Disk [datastore1] vmware_temp/c27fd0d1-6388-4915-b79b-604f9f918a44/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/c27fd0d1-6388-4915-b79b-604f9f918a44/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2212.539443] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-925ae147-b9a4-475d-b9c0-3ab6d2bdfbe4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2212.550550] env[68571]: DEBUG oslo_vmware.api [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Waiting for the task: (returnval){ [ 2212.550550] env[68571]: value = "task-3467786" [ 2212.550550] env[68571]: _type = "Task" [ 2212.550550] env[68571]: } to complete. 
{{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2212.558929] env[68571]: DEBUG oslo_vmware.api [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Task: {'id': task-3467786, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2213.060019] env[68571]: DEBUG oslo_vmware.exceptions [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Fault InvalidArgument not matched. {{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2213.060323] env[68571]: DEBUG oslo_concurrency.lockutils [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2213.060874] env[68571]: ERROR nova.compute.manager [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2213.060874] env[68571]: Faults: ['InvalidArgument'] [ 2213.060874] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Traceback (most recent call last): [ 2213.060874] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2213.060874] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] yield resources [ 2213.060874] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2213.060874] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] self.driver.spawn(context, instance, image_meta, [ 2213.060874] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2213.060874] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2213.060874] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2213.060874] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] self._fetch_image_if_missing(context, vi) [ 2213.060874] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2213.061200] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] image_cache(vi, tmp_image_ds_loc) [ 2213.061200] env[68571]: ERROR nova.compute.manager [instance: 
f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2213.061200] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] vm_util.copy_virtual_disk( [ 2213.061200] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2213.061200] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] session._wait_for_task(vmdk_copy_task) [ 2213.061200] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2213.061200] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] return self.wait_for_task(task_ref) [ 2213.061200] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2213.061200] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] return evt.wait() [ 2213.061200] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2213.061200] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] result = hub.switch() [ 2213.061200] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2213.061200] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] return self.greenlet.switch() [ 2213.061500] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2213.061500] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] self.f(*self.args, **self.kw) [ 2213.061500] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2213.061500] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] raise exceptions.translate_fault(task_info.error) [ 2213.061500] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2213.061500] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Faults: ['InvalidArgument'] [ 2213.061500] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] [ 2213.061500] env[68571]: INFO nova.compute.manager [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Terminating instance [ 2213.062703] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquired lock "[datastore1] 
devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2213.062907] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2213.063163] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-104ae38a-d57a-4292-b1eb-fa5421519d39 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2213.066055] env[68571]: DEBUG nova.compute.manager [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2213.066055] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2213.066461] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e49c5c90-e740-4618-bd62-246eca8b8f44 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2213.072790] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2213.072995] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-577ceee2-60cb-4354-b963-e64575c8335c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2213.075133] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2213.075267] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Folder [datastore1] devstack-image-cache_base created. 
{{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2213.076214] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-d83fa5c7-e245-4ed0-8da1-db32394ecde1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2213.080891] env[68571]: DEBUG oslo_vmware.api [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Waiting for the task: (returnval){ [ 2213.080891] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52437503-6303-1c0e-caf9-f5bed383ef01" [ 2213.080891] env[68571]: _type = "Task" [ 2213.080891] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2213.087535] env[68571]: DEBUG oslo_vmware.api [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52437503-6303-1c0e-caf9-f5bed383ef01, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2213.139105] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2213.139337] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2213.139502] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Deleting the datastore file [datastore1] f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2 {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2213.139780] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-1f9c6936-7779-44fb-afa9-c3dda19a5517 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2213.146103] env[68571]: DEBUG oslo_vmware.api [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Waiting for the task: (returnval){ [ 2213.146103] env[68571]: value = "task-3467788" [ 2213.146103] env[68571]: _type = "Task" [ 2213.146103] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2213.153560] env[68571]: DEBUG oslo_vmware.api [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Task: {'id': task-3467788, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2213.489579] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2213.489830] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2213.501408] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2213.501615] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2213.501780] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2213.501937] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2213.502994] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8bd4b4d7-7a7d-4de4-ad76-9a29d6b18303 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2213.511418] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-597b9fb8-35d2-48a3-9b18-89276349a863 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2213.524775] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d33309fb-8647-4999-b256-f67f42cb186f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2213.531581] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e0cef923-0ad2-4c1a-a5f9-4f1f30c9e936 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2213.562294] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180903MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2213.562592] env[68571]: 
DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2213.562592] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2213.590506] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2213.590748] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Creating directory with path [datastore1] vmware_temp/1afaec9f-55f7-4698-a76e-4f03d8f022ff/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2213.593928] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-f6c29323-5bae-4a27-b4d3-94e6c176c94a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2213.608273] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Created directory with path [datastore1] vmware_temp/1afaec9f-55f7-4698-a76e-4f03d8f022ff/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2213.608469] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Fetch image to [datastore1] vmware_temp/1afaec9f-55f7-4698-a76e-4f03d8f022ff/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2213.608637] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/1afaec9f-55f7-4698-a76e-4f03d8f022ff/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2213.609394] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6576d242-17b4-486a-b332-49fa9816c354 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2213.616260] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with 
opID=oslo.vmware-4ab67fb8-1e8c-4f85-8780-2093ce166434 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2213.625454] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b2fd0e5e-eb50-4a71-b519-9d59676a7a34 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2213.629686] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2213.629831] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 62ce83ad-bb1b-4f78-8d0b-9b516290bac6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2213.629956] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance ad3a9183-0e9e-44df-b920-b8b8360a65e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2213.630093] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 49f25bb6-d27f-468c-ba5d-2f5d96bb04df actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2213.630214] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 17530424-18ad-4713-ae56-acbe585bd5d9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2213.630329] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 1799a6b4-70c1-4a96-9bf9-4e855c11039f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2213.630504] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 6 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2213.630642] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1280MB phys_disk=200GB used_disk=6GB total_vcpus=48 used_vcpus=6 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2213.664995] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-7c143119-3c90-43c6-80be-139564844584 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2213.672339] env[68571]: DEBUG oslo_vmware.api [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Task: {'id': task-3467788, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.075216} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2213.673811] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2213.674011] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2213.674190] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2213.674393] env[68571]: INFO nova.compute.manager [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Took 0.61 seconds to destroy the instance on the hypervisor. 
[ 2213.676151] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-dda2a97d-0b10-4760-981d-0bb3278f6d41 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2213.678051] env[68571]: DEBUG nova.compute.claims [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2213.678184] env[68571]: DEBUG oslo_concurrency.lockutils [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2213.696795] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2213.736332] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b766c95f-ba1f-40d3-9a0b-7760aefc12db {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2213.744914] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-61fe5657-7b73-4c8f-827e-0ddd527f723f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2213.777504] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbe3b751-69f0-43d6-866b-96daffbc7b10 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2213.784553] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6b3fbeb4-a0e1-46ae-af48-d9eb56ab6485 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2213.798621] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2213.802788] env[68571]: DEBUG oslo_vmware.rw_handles [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1afaec9f-55f7-4698-a76e-4f03d8f022ff/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2213.857768] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2213.862765] env[68571]: DEBUG oslo_vmware.rw_handles [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2213.862765] env[68571]: DEBUG oslo_vmware.rw_handles [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/1afaec9f-55f7-4698-a76e-4f03d8f022ff/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2213.872078] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2213.872078] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.309s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2213.872253] env[68571]: DEBUG oslo_concurrency.lockutils [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.194s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2213.987636] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-8232cced-d9a5-46c9-bba5-4563b71e942d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2213.995038] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-361a8832-ae0f-429f-9295-c4e316742f4f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2214.024685] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-72094a1d-fc5f-4190-978e-47adc8b14ec1 {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2214.031266] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c8eb7977-b4b2-4970-b2c2-fec3707abcb4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2214.043920] env[68571]: DEBUG nova.compute.provider_tree [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2214.051867] env[68571]: DEBUG nova.scheduler.client.report [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2214.064255] env[68571]: DEBUG oslo_concurrency.lockutils [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.192s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2214.064780] env[68571]: ERROR nova.compute.manager [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2214.064780] env[68571]: Faults: ['InvalidArgument'] [ 2214.064780] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Traceback (most recent call last): [ 2214.064780] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2214.064780] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] self.driver.spawn(context, instance, image_meta, [ 2214.064780] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2214.064780] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2214.064780] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2214.064780] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] self._fetch_image_if_missing(context, vi) [ 2214.064780] env[68571]: ERROR nova.compute.manager [instance: 
f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2214.064780] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] image_cache(vi, tmp_image_ds_loc) [ 2214.064780] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2214.065183] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] vm_util.copy_virtual_disk( [ 2214.065183] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2214.065183] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] session._wait_for_task(vmdk_copy_task) [ 2214.065183] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2214.065183] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] return self.wait_for_task(task_ref) [ 2214.065183] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2214.065183] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] return evt.wait() [ 2214.065183] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2214.065183] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] result = hub.switch() [ 2214.065183] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2214.065183] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] return self.greenlet.switch() [ 2214.065183] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2214.065183] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] self.f(*self.args, **self.kw) [ 2214.065572] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2214.065572] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] raise exceptions.translate_fault(task_info.error) [ 2214.065572] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2214.065572] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Faults: ['InvalidArgument'] [ 2214.065572] env[68571]: ERROR nova.compute.manager [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] [ 2214.065572] env[68571]: DEBUG nova.compute.utils [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] 
[instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2214.066813] env[68571]: DEBUG nova.compute.manager [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Build of instance f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2 was re-scheduled: A specified parameter was not correct: fileType [ 2214.066813] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2214.067184] env[68571]: DEBUG nova.compute.manager [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2214.067353] env[68571]: DEBUG nova.compute.manager [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Virt driver does not provide unplug_vifs method, so it is not possible to determine if VIFs should be unplugged. {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2214.067522] env[68571]: DEBUG nova.compute.manager [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2214.067701] env[68571]: DEBUG nova.network.neutron [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2214.361474] env[68571]: DEBUG nova.network.neutron [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2214.379327] env[68571]: INFO nova.compute.manager [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Took 0.31 seconds to deallocate network for instance.
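Note on the traceback above: it bottoms out in oslo_vmware.api._poll_task raising exceptions.translate_fault(task_info.error) out of wait_for_task. A minimal sketch of that polling pattern, where get_task_info and TaskFault are hypothetical stand-ins for the real SOAP property reads and fault classes, not oslo.vmware's actual implementation:

import time

class TaskFault(Exception):
    """Stand-in for oslo_vmware.exceptions.VimFaultException."""

def wait_for_task(session, task_ref, interval=0.5):
    # Poll the vCenter task object until it reaches a terminal state.
    while True:
        info = session.get_task_info(task_ref)  # hypothetical helper around PropertyCollector
        if info.state == 'success':
            return info.result
        if info.state == 'error':
            # Mirrors `raise exceptions.translate_fault(task_info.error)` in the traceback.
            raise TaskFault(getattr(info.error, 'localizedMessage', info.error))
        time.sleep(interval)  # 'queued'/'running': keep polling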
[ 2214.469332] env[68571]: INFO nova.scheduler.client.report [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Deleted allocations for instance f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2 [ 2214.491550] env[68571]: DEBUG oslo_concurrency.lockutils [None req-46c5ac78-4769-45c1-a29d-4b6f3be75363 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Lock "f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.._locked_do_build_and_run_instance" :: held 608.811s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2214.491816] env[68571]: DEBUG oslo_concurrency.lockutils [None req-19d95806-f096-439f-82ed-e383733ddc97 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Lock "f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2" acquired by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: waited 413.172s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2214.492053] env[68571]: DEBUG oslo_concurrency.lockutils [None req-19d95806-f096-439f-82ed-e383733ddc97 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquiring lock "f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2214.492286] env[68571]: DEBUG oslo_concurrency.lockutils [None req-19d95806-f096-439f-82ed-e383733ddc97 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Lock "f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2214.492441] env[68571]: DEBUG oslo_concurrency.lockutils [None req-19d95806-f096-439f-82ed-e383733ddc97 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Lock "f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2214.494666] env[68571]: INFO nova.compute.manager [None req-19d95806-f096-439f-82ed-e383733ddc97 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Terminating instance [ 2214.496424] env[68571]: DEBUG nova.compute.manager [None req-19d95806-f096-439f-82ed-e383733ddc97 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Start destroying the instance on the hypervisor. 
{{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2214.496622] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-19d95806-f096-439f-82ed-e383733ddc97 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2214.497128] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-dccbbde7-f838-4e5e-8352-2b7f9f44710d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2214.506823] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-c7f52c75-dc33-4aff-841f-299b3da3faf3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2214.532063] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-19d95806-f096-439f-82ed-e383733ddc97 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2 could not be found. [ 2214.532271] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-19d95806-f096-439f-82ed-e383733ddc97 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2214.532455] env[68571]: INFO nova.compute.manager [None req-19d95806-f096-439f-82ed-e383733ddc97 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Took 0.04 seconds to destroy the instance on the hypervisor. [ 2214.532750] env[68571]: DEBUG oslo.service.loopingcall [None req-19d95806-f096-439f-82ed-e383733ddc97 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.._deallocate_network_with_retries to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2214.532973] env[68571]: DEBUG nova.compute.manager [-] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2214.533087] env[68571]: DEBUG nova.network.neutron [-] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2214.560359] env[68571]: DEBUG nova.network.neutron [-] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2214.568164] env[68571]: INFO nova.compute.manager [-] [instance: f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2] Took 0.03 seconds to deallocate network for instance. 
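Note on the terminate sequence above: the WARNING ("Instance does not exist on backend: nova.exception.InstanceNotFound") is immediately followed by "Instance destroyed" and network deallocation, i.e. a VM already gone from the backend is treated as successfully destroyed. A minimal runnable sketch with hypothetical classes, not Nova's real driver interface:

class InstanceNotFound(Exception):
    """Stand-in for nova.exception.InstanceNotFound."""

class MissingVMDriver:
    """Hypothetical driver whose VM has already vanished from the backend."""
    def destroy(self, instance):
        raise InstanceNotFound(f"Instance {instance} could not be found.")

def terminate(driver, instance):
    try:
        driver.destroy(instance)  # unregister the VM, delete its datastore files
    except InstanceNotFound:
        # As in the WARNING above: nothing is left on the hypervisor,
        # so proceed as if the destroy succeeded.
        print(f"Instance does not exist on backend: {instance}")
    print(f"Deallocating network for instance {instance}")

terminate(MissingVMDriver(), "f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2")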
[ 2214.652351] env[68571]: DEBUG oslo_concurrency.lockutils [None req-19d95806-f096-439f-82ed-e383733ddc97 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Lock "f5b58a01-b52a-466b-b5bc-8a1ea0c2ded2" "released" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" :: held 0.160s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2215.873569] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2218.484641] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2218.489183] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2218.489332] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2218.489454] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2218.504677] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2218.504913] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2218.505158] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2218.505362] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2218.505572] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Skipping network cache update for instance because it is Building. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2218.505782] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2218.506242] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2221.490456] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2225.490273] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2225.490611] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2234.486234] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2261.807380] env[68571]: WARNING oslo_vmware.rw_handles [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2261.807380] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2261.807380] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2261.807380] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2261.807380] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2261.807380] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 2261.807380] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2261.807380] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2261.807380] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2261.807380] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2261.807380] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2261.807380] env[68571]: ERROR oslo_vmware.rw_handles [ 2261.808039] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 
62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/1afaec9f-55f7-4698-a76e-4f03d8f022ff/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2261.810537] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2261.810815] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Copying Virtual Disk [datastore1] vmware_temp/1afaec9f-55f7-4698-a76e-4f03d8f022ff/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/1afaec9f-55f7-4698-a76e-4f03d8f022ff/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2261.811171] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-af78f89f-68e0-4ab2-bfb7-daa848cc543a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2261.818496] env[68571]: DEBUG oslo_vmware.api [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Waiting for the task: (returnval){ [ 2261.818496] env[68571]: value = "task-3467789" [ 2261.818496] env[68571]: _type = "Task" [ 2261.818496] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2261.826813] env[68571]: DEBUG oslo_vmware.api [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Task: {'id': task-3467789, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2262.328860] env[68571]: DEBUG oslo_vmware.exceptions [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Fault InvalidArgument not matched. 
{{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2262.329154] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2262.329716] env[68571]: ERROR nova.compute.manager [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2262.329716] env[68571]: Faults: ['InvalidArgument'] [ 2262.329716] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Traceback (most recent call last): [ 2262.329716] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2262.329716] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] yield resources [ 2262.329716] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2262.329716] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] self.driver.spawn(context, instance, image_meta, [ 2262.329716] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2262.329716] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2262.329716] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2262.329716] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] self._fetch_image_if_missing(context, vi) [ 2262.329716] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2262.330105] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] image_cache(vi, tmp_image_ds_loc) [ 2262.330105] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2262.330105] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] vm_util.copy_virtual_disk( [ 2262.330105] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2262.330105] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] session._wait_for_task(vmdk_copy_task) [ 2262.330105] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] File 
"/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2262.330105] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] return self.wait_for_task(task_ref) [ 2262.330105] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2262.330105] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] return evt.wait() [ 2262.330105] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2262.330105] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] result = hub.switch() [ 2262.330105] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2262.330105] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] return self.greenlet.switch() [ 2262.330488] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2262.330488] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] self.f(*self.args, **self.kw) [ 2262.330488] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2262.330488] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] raise exceptions.translate_fault(task_info.error) [ 2262.330488] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2262.330488] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Faults: ['InvalidArgument'] [ 2262.330488] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] [ 2262.330488] env[68571]: INFO nova.compute.manager [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Terminating instance [ 2262.331540] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2262.331739] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2262.331968] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-8432e68d-4ab5-44aa-a4bd-f9f4e53b5163 {{(pid=68571) 
request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2262.335036] env[68571]: DEBUG nova.compute.manager [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2262.335242] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2262.335941] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0b227018-4ed5-45c4-991f-29713cfc7492 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2262.342537] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2262.342748] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1cceb6c5-0129-4c25-ac7d-dfe331cc8686 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2262.344866] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2262.345094] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2262.346062] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-5d6bfb3e-59a0-405e-8a1e-74d0be9d94dc {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2262.350486] env[68571]: DEBUG oslo_vmware.api [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Waiting for the task: (returnval){ [ 2262.350486] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52767469-098e-bd86-7142-5306b5b5a39f" [ 2262.350486] env[68571]: _type = "Task" [ 2262.350486] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2262.359686] env[68571]: DEBUG oslo_vmware.api [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52767469-098e-bd86-7142-5306b5b5a39f, 'name': SearchDatastore_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2262.412542] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2262.412742] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2262.412934] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Deleting the datastore file [datastore1] 62ce83ad-bb1b-4f78-8d0b-9b516290bac6 {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2262.413194] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-40e9c98b-7102-401e-9109-a810d938af56 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2262.421095] env[68571]: DEBUG oslo_vmware.api [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Waiting for the task: (returnval){ [ 2262.421095] env[68571]: value = "task-3467791" [ 2262.421095] env[68571]: _type = "Task" [ 2262.421095] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2262.428241] env[68571]: DEBUG oslo_vmware.api [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Task: {'id': task-3467791, 'name': DeleteDatastoreFile_Task} progress is 0%. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2262.860694] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2262.860994] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Creating directory with path [datastore1] vmware_temp/6bdcde6c-960d-4a6b-9d48-1dc4aa98f4e3/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2262.861217] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-9f96b883-6561-4ddc-8900-bcdb1bc6678c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2262.872433] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Created directory with path [datastore1] vmware_temp/6bdcde6c-960d-4a6b-9d48-1dc4aa98f4e3/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2262.872622] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Fetch image to [datastore1] vmware_temp/6bdcde6c-960d-4a6b-9d48-1dc4aa98f4e3/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2262.872809] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/6bdcde6c-960d-4a6b-9d48-1dc4aa98f4e3/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2262.873639] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0af6817c-12bb-49e3-af92-125ad5cc54d3 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2262.881218] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-edf15e7c-dc7e-4730-bb05-99632c2a2b6f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2262.890275] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f8c254d3-b0d4-41af-bf7c-1e844e9bfdec {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2262.920111] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-fbbe0370-134c-46d0-a93e-1bcba6ae0334 {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2262.930476] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-44bc2cd7-57bd-4a59-b614-ea2829771f69 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2262.932069] env[68571]: DEBUG oslo_vmware.api [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Task: {'id': task-3467791, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.077075} completed successfully. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2262.932308] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2262.932488] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2262.932658] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2262.932840] env[68571]: INFO nova.compute.manager [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Took 0.60 seconds to destroy the instance on the hypervisor. 
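Note on the image-cache records in this stretch: the flow is to lock the cached vmdk path, download image 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to a per-request vmware_temp/... location over an HTTP write handle, then move it into devstack-image-cache_base. A minimal sketch of that fetch-if-missing pattern using oslo.concurrency's lock; the paths and the download callable are illustrative, and a local shutil.move stands in for the datastore-side CopyVirtualDisk_Task:

import shutil
import tempfile
from pathlib import Path

from oslo_concurrency import lockutils

def fetch_image_if_missing(image_id, cache_dir, download):
    cached = Path(cache_dir) / image_id / f"{image_id}.vmdk"
    # Same idea as the '[datastore1] devstack-image-cache_base/....vmdk' lock
    # in the log: serialize concurrent builds that need the same image.
    with lockutils.lock(str(cached)):
        if not cached.exists():
            tmp = Path(tempfile.mkdtemp(prefix="vmware_temp-")) / "tmp-sparse.vmdk"
            download(image_id, tmp)  # the HTTP write handle in the log
            cached.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(tmp), cached)  # the log does this via CopyVirtualDisk_Task
    return cached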
[ 2262.935071] env[68571]: DEBUG nova.compute.claims [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2262.935222] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2262.935440] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2262.957984] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2263.006523] env[68571]: DEBUG oslo_vmware.rw_handles [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6bdcde6c-960d-4a6b-9d48-1dc4aa98f4e3/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2263.065052] env[68571]: DEBUG oslo_vmware.rw_handles [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2263.065262] env[68571]: DEBUG oslo_vmware.rw_handles [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Closing write handle for https://esx7c1n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/6bdcde6c-960d-4a6b-9d48-1dc4aa98f4e3/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2263.106733] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-ea5c1abd-eaf5-4ef6-b327-f3e88b3a93a0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2263.113510] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-81bc8633-a2c5-4f6d-bf22-64cac94d22f0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2263.143691] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-30eaa81a-a5dd-41a9-866a-6e9970b4879b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2263.151038] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e195b32e-2796-4ca8-8e67-8b7560deb758 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2263.164470] env[68571]: DEBUG nova.compute.provider_tree [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2263.176752] env[68571]: DEBUG nova.scheduler.client.report [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2263.193610] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.258s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2263.194182] env[68571]: ERROR nova.compute.manager [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2263.194182] env[68571]: Faults: ['InvalidArgument'] [ 2263.194182] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Traceback (most recent call last): [ 2263.194182] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2263.194182] 
env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] self.driver.spawn(context, instance, image_meta, [ 2263.194182] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2263.194182] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2263.194182] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2263.194182] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] self._fetch_image_if_missing(context, vi) [ 2263.194182] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2263.194182] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] image_cache(vi, tmp_image_ds_loc) [ 2263.194182] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2263.194518] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] vm_util.copy_virtual_disk( [ 2263.194518] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2263.194518] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] session._wait_for_task(vmdk_copy_task) [ 2263.194518] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2263.194518] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] return self.wait_for_task(task_ref) [ 2263.194518] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2263.194518] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] return evt.wait() [ 2263.194518] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2263.194518] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] result = hub.switch() [ 2263.194518] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2263.194518] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] return self.greenlet.switch() [ 2263.194518] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2263.194518] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] self.f(*self.args, **self.kw) [ 2263.194861] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2263.194861] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] raise exceptions.translate_fault(task_info.error) [ 2263.194861] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2263.194861] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Faults: ['InvalidArgument'] [ 2263.194861] env[68571]: ERROR nova.compute.manager [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] [ 2263.195061] env[68571]: DEBUG nova.compute.utils [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2263.196853] env[68571]: DEBUG nova.compute.manager [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Build of instance 62ce83ad-bb1b-4f78-8d0b-9b516290bac6 was re-scheduled: A specified parameter was not correct: fileType [ 2263.196853] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2263.197243] env[68571]: DEBUG nova.compute.manager [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2263.197437] env[68571]: DEBUG nova.compute.manager [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. 
[ 2263.197612] env[68571]: DEBUG nova.compute.manager [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2263.197787] env[68571]: DEBUG nova.network.neutron [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2263.507637] env[68571]: DEBUG nova.network.neutron [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2263.519456] env[68571]: INFO nova.compute.manager [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Took 0.32 seconds to deallocate network for instance. [ 2263.620583] env[68571]: INFO nova.scheduler.client.report [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Deleted allocations for instance 62ce83ad-bb1b-4f78-8d0b-9b516290bac6 [ 2263.641875] env[68571]: DEBUG oslo_concurrency.lockutils [None req-a27db6ed-a97f-44f9-908b-bdb68e1c2043 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "62ce83ad-bb1b-4f78-8d0b-9b516290bac6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 514.679s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2263.642159] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f229748c-3a80-4471-abc5-10783d125053 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "62ce83ad-bb1b-4f78-8d0b-9b516290bac6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 319.224s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2263.642388] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f229748c-3a80-4471-abc5-10783d125053 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "62ce83ad-bb1b-4f78-8d0b-9b516290bac6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2263.642592] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f229748c-3a80-4471-abc5-10783d125053 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "62ce83ad-bb1b-4f78-8d0b-9b516290bac6-events" acquired by
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2263.642755] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f229748c-3a80-4471-abc5-10783d125053 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "62ce83ad-bb1b-4f78-8d0b-9b516290bac6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2263.644796] env[68571]: INFO nova.compute.manager [None req-f229748c-3a80-4471-abc5-10783d125053 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Terminating instance [ 2263.646673] env[68571]: DEBUG nova.compute.manager [None req-f229748c-3a80-4471-abc5-10783d125053 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2263.646767] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-f229748c-3a80-4471-abc5-10783d125053 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2263.647370] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-c13d50c5-ce40-4a3f-8b61-9ca8345ec6f2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2263.656261] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-224ae544-8308-4037-9f38-553004914806 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2263.682397] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-f229748c-3a80-4471-abc5-10783d125053 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 62ce83ad-bb1b-4f78-8d0b-9b516290bac6 could not be found. [ 2263.682592] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-f229748c-3a80-4471-abc5-10783d125053 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2263.682769] env[68571]: INFO nova.compute.manager [None req-f229748c-3a80-4471-abc5-10783d125053 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Took 0.04 seconds to destroy the instance on the hypervisor. 
[ 2263.683039] env[68571]: DEBUG oslo.service.loopingcall [None req-f229748c-3a80-4471-abc5-10783d125053 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2263.683261] env[68571]: DEBUG nova.compute.manager [-] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2263.683360] env[68571]: DEBUG nova.network.neutron [-] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2263.707010] env[68571]: DEBUG nova.network.neutron [-] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2263.714634] env[68571]: INFO nova.compute.manager [-] [instance: 62ce83ad-bb1b-4f78-8d0b-9b516290bac6] Took 0.03 seconds to deallocate network for instance. [ 2263.798908] env[68571]: DEBUG oslo_concurrency.lockutils [None req-f229748c-3a80-4471-abc5-10783d125053 tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Lock "62ce83ad-bb1b-4f78-8d0b-9b516290bac6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.157s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2272.489610] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2273.489440] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2274.488945] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2274.501048] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2274.501269] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2274.501437] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by
"nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2274.501593] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2274.502689] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-f5437ec8-7509-4997-84f2-7b422b41e492 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2274.511529] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-6a3c35e1-ddaa-4cb2-a470-90dcfc240eea {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2274.525717] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-0a54f775-21cb-47f1-a465-2f177b908f52 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2274.532277] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-e3df1739-cef0-4f68-a304-95e4457ba8cd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2274.560310] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180763MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2274.560467] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2274.560663] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2274.609895] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance ad3a9183-0e9e-44df-b920-b8b8360a65e5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2274.610075] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 49f25bb6-d27f-468c-ba5d-2f5d96bb04df actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2274.610280] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 17530424-18ad-4713-ae56-acbe585bd5d9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2274.610343] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 1799a6b4-70c1-4a96-9bf9-4e855c11039f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2274.610505] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 4 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2274.610640] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=1024MB phys_disk=200GB used_disk=4GB total_vcpus=48 used_vcpus=4 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2274.672051] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-b270f601-bc1a-4629-ae5f-9de3404af9e7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2274.679431] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4998315a-3d0e-4eff-b00f-2ecd999f0f04 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2274.709098] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5e55342a-3e29-4094-a377-e494b9701ec2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2274.715787] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-cd8dce46-7267-4e39-9e8c-527707adbd20 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2274.728623] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2274.736992] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}}
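The inventory record just reported, together with the final resource view, is enough to recompute what Placement treats as schedulable: per resource class the effective capacity is (total - reserved) * allocation_ratio, and used_ram=1024MB is the 512 MB reservation plus the four 128 MB allocations listed during the audit. A quick check of those numbers:

# Recomputing schedulable capacity from the logged inventory using the
# standard Placement formula: (total - reserved) * allocation_ratio.
inventory = {
    'VCPU':      {'total': 48,     'reserved': 0,   'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 196590, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB':   {'total': 400,    'reserved': 0,   'allocation_ratio': 1.0},
}

for rc, inv in inventory.items():
    capacity = (inv['total'] - inv['reserved']) * inv['allocation_ratio']
    print(rc, capacity)
# VCPU 192.0, MEMORY_MB 196078.0, DISK_GB 400.0

# used_ram=1024MB in the final resource view: the 512 MB reservation plus
# the four instances above, each holding a MEMORY_MB=128 allocation.
assert 512 + 4 * 128 == 1024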
[ 2274.750602] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2274.750839] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.190s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2275.751936] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2278.485830] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2278.489421] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2279.490256] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2279.490634] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2279.490634] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2279.503883] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2279.504048] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2279.504164] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Skipping network cache update for instance because it is Building.
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2279.504451] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2279.504451] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2282.489682] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2286.492626] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2286.492626] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2311.823746] env[68571]: WARNING oslo_vmware.rw_handles [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2311.823746] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2311.823746] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2311.823746] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2311.823746] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2311.823746] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 2311.823746] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2311.823746] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2311.823746] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in _read_status [ 2311.823746] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2311.823746] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2311.823746] env[68571]: ERROR oslo_vmware.rw_handles [ 2311.824300] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/6bdcde6c-960d-4a6b-9d48-1dc4aa98f4e3/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}}
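The WARNING above is a benign pattern for these uploads: the host closes the connection without sending a response when the write handle is closed, yet the very next entry reports the image file data as downloaded, so the error is logged and swallowed rather than failing the transfer. A tolerant close might look like this sketch, assuming `conn` is an already-open http.client connection whose upload has completed:

# Tolerating a server that closes the connection without a response after
# an upload, as the WARNING above shows: log and carry on, since the data
# was already written. `conn` is assumed to be an
# http.client.HTTPSConnection with a completed PUT.
import http.client
import logging

LOG = logging.getLogger(__name__)

def close_write_handle(conn):
    try:
        conn.getresponse()
    except http.client.RemoteDisconnected:
        LOG.warning("Error occurred while reading the HTTP response.",
                    exc_info=True)
    finally:
        conn.close()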
[ 2311.826451] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2311.826731] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Copying Virtual Disk [datastore1] vmware_temp/6bdcde6c-960d-4a6b-9d48-1dc4aa98f4e3/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/6bdcde6c-960d-4a6b-9d48-1dc4aa98f4e3/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2311.827049] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-11a54c7d-cd36-43a8-8dc8-de975e46e84c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2311.835344] env[68571]: DEBUG oslo_vmware.api [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Waiting for the task: (returnval){ [ 2311.835344] env[68571]: value = "task-3467792" [ 2311.835344] env[68571]: _type = "Task" [ 2311.835344] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2311.843421] env[68571]: DEBUG oslo_vmware.api [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Task: {'id': task-3467792, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2312.344763] env[68571]: DEBUG oslo_vmware.exceptions [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Fault InvalidArgument not matched.
{{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2312.345089] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2312.345932] env[68571]: ERROR nova.compute.manager [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2312.345932] env[68571]: Faults: ['InvalidArgument'] [ 2312.345932] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Traceback (most recent call last): [ 2312.345932] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2312.345932] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] yield resources [ 2312.345932] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2312.345932] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] self.driver.spawn(context, instance, image_meta, [ 2312.345932] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2312.345932] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2312.345932] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2312.345932] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] self._fetch_image_if_missing(context, vi) [ 2312.345932] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2312.346417] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] image_cache(vi, tmp_image_ds_loc) [ 2312.346417] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2312.346417] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] vm_util.copy_virtual_disk( [ 2312.346417] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2312.346417] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] session._wait_for_task(vmdk_copy_task) [ 2312.346417] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, 
in _wait_for_task [ 2312.346417] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] return self.wait_for_task(task_ref) [ 2312.346417] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2312.346417] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] return evt.wait() [ 2312.346417] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2312.346417] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] result = hub.switch() [ 2312.346417] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2312.346417] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] return self.greenlet.switch() [ 2312.346791] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2312.346791] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] self.f(*self.args, **self.kw) [ 2312.346791] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2312.346791] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] raise exceptions.translate_fault(task_info.error) [ 2312.346791] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2312.346791] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Faults: ['InvalidArgument'] [ 2312.346791] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] [ 2312.346791] env[68571]: INFO nova.compute.manager [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Terminating instance [ 2312.347838] env[68571]: DEBUG oslo_concurrency.lockutils [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2312.348021] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2312.348267] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-ada56de6-848b-4b5b-8373-daf90699f115 {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2312.350534] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Acquiring lock "refresh_cache-ad3a9183-0e9e-44df-b920-b8b8360a65e5" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2312.350695] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Acquired lock "refresh_cache-ad3a9183-0e9e-44df-b920-b8b8360a65e5" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2312.350858] env[68571]: DEBUG nova.network.neutron [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2312.357246] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2312.357411] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2312.358096] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-aade6311-4884-460f-89fb-d0a8ceea7f77 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2312.365246] env[68571]: DEBUG oslo_vmware.api [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Waiting for the task: (returnval){ [ 2312.365246] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]5260824b-50b4-778c-4f61-182b020548f0" [ 2312.365246] env[68571]: _type = "Task" [ 2312.365246] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2312.373852] env[68571]: DEBUG oslo_vmware.api [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]5260824b-50b4-778c-4f61-182b020548f0, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2312.417432] env[68571]: DEBUG nova.network.neutron [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Instance cache missing network info. 
{{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2312.476890] env[68571]: DEBUG nova.network.neutron [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2312.486284] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Releasing lock "refresh_cache-ad3a9183-0e9e-44df-b920-b8b8360a65e5" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2312.486694] env[68571]: DEBUG nova.compute.manager [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2312.486893] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2312.487946] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-481c3b0d-0a45-47e6-86d5-efdc08f22a12 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2312.495619] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2312.495844] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-1a2c15e0-c7b8-48e8-a206-4922b36b3441 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2312.521445] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2312.521638] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2312.521813] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Deleting the datastore file [datastore1] ad3a9183-0e9e-44df-b920-b8b8360a65e5 {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2312.522063] env[68571]: 
DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-b63f33da-e690-44c9-b0ce-01b5ef5a96f7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2312.528660] env[68571]: DEBUG oslo_vmware.api [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Waiting for the task: (returnval){ [ 2312.528660] env[68571]: value = "task-3467794" [ 2312.528660] env[68571]: _type = "Task" [ 2312.528660] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2312.535775] env[68571]: DEBUG oslo_vmware.api [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Task: {'id': task-3467794, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2312.875718] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2312.875981] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Creating directory with path [datastore1] vmware_temp/f8f29db3-f44e-48d6-8fc6-fb81528f98a1/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2312.876240] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6a2cc86d-ecf5-477c-ab61-9cb117823214 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2312.887636] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Created directory with path [datastore1] vmware_temp/f8f29db3-f44e-48d6-8fc6-fb81528f98a1/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2312.887834] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Fetch image to [datastore1] vmware_temp/f8f29db3-f44e-48d6-8fc6-fb81528f98a1/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2312.888012] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/f8f29db3-f44e-48d6-8fc6-fb81528f98a1/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2312.888727] env[68571]: DEBUG oslo_vmware.service [-] Invoking 
PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fb56299-048f-4841-a023-76ba5713f816 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2312.895264] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-2181152c-1b27-4b13-93db-dc173fc26b63 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2312.904101] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5cd2734a-c291-4e58-9680-e20e6abdeabd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2312.933629] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-05664bc7-858e-4cb4-a958-c229dcbf2f7b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2312.939426] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-8e879fbf-4476-43e8-a7df-25880fb4a7bf {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2312.958495] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2313.006496] env[68571]: DEBUG oslo_vmware.rw_handles [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f8f29db3-f44e-48d6-8fc6-fb81528f98a1/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2313.065644] env[68571]: DEBUG oslo_vmware.rw_handles [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2313.065803] env[68571]: DEBUG oslo_vmware.rw_handles [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Closing write handle for https://esx7c2n3.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/f8f29db3-f44e-48d6-8fc6-fb81528f98a1/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2313.069691] env[68571]: DEBUG oslo_vmware.api [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Task: {'id': task-3467794, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.043824} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2313.069921] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2313.070114] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2313.070286] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2313.070456] env[68571]: INFO nova.compute.manager [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Took 0.58 seconds to destroy the instance on the hypervisor. [ 2313.070688] env[68571]: DEBUG oslo.service.loopingcall [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2313.070894] env[68571]: DEBUG nova.compute.manager [-] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Skipping network deallocation for instance since networking was not requested. {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}}
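Both tasks in this stretch follow the same wait pattern: the CopyVirtualDisk_Task above was polled at 0% and ended in a translated fault, while the DeleteDatastoreFile_Task just completed with duration_secs 0.043824. A simplified polling loop in that spirit (the state names mirror the log; `get_task_info` is an assumed callable, not the oslo_vmware API):

# Simplified version of the wait_for_task/_poll_task pattern in this log:
# poll task info until it leaves queued/running, then return the result or
# raise the fault recorded on the task. `get_task_info` is assumed to
# return an object with .state, .progress, .result and .error attributes.
import time

def wait_for_task(task_id, get_task_info, interval=0.5):
    while True:
        info = get_task_info(task_id)
        if info.state in ('queued', 'running'):
            print(f"Task: {task_id} progress is {info.progress}%.")
            time.sleep(interval)
            continue
        if info.state == 'success':
            return info.result
        # Terminal 'error' state: surface the recorded fault.
        raise RuntimeError(info.error)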
[ 2313.073030] env[68571]: DEBUG nova.compute.claims [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2313.073203] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2313.073422] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2313.164445] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-33933fb4-553a-4561-b938-15f6b314b0d7 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2313.171647] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-11f9d80d-1be7-423a-9200-10aaf09ff1d1 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2313.200220] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32459a1d-1bbc-4008-b183-610c010f4da0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2313.206642] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a3e54ac5-7d91-49ed-89a6-a3f01fbe4bee {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2313.219237] env[68571]: DEBUG nova.compute.provider_tree [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2313.227481] env[68571]: DEBUG nova.scheduler.client.report [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2313.239689] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bc174b91-e67a-477e-bd0e-b774df00d226
tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.166s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2313.240208] env[68571]: ERROR nova.compute.manager [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2313.240208] env[68571]: Faults: ['InvalidArgument'] [ 2313.240208] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Traceback (most recent call last): [ 2313.240208] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2313.240208] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] self.driver.spawn(context, instance, image_meta, [ 2313.240208] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2313.240208] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2313.240208] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2313.240208] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] self._fetch_image_if_missing(context, vi) [ 2313.240208] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2313.240208] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] image_cache(vi, tmp_image_ds_loc) [ 2313.240208] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2313.240564] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] vm_util.copy_virtual_disk( [ 2313.240564] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2313.240564] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] session._wait_for_task(vmdk_copy_task) [ 2313.240564] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2313.240564] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] return self.wait_for_task(task_ref) [ 2313.240564] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2313.240564] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] return evt.wait() [ 2313.240564] env[68571]: ERROR 
nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2313.240564] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] result = hub.switch() [ 2313.240564] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2313.240564] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] return self.greenlet.switch() [ 2313.240564] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2313.240564] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] self.f(*self.args, **self.kw) [ 2313.240843] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2313.240843] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] raise exceptions.translate_fault(task_info.error) [ 2313.240843] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2313.240843] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Faults: ['InvalidArgument'] [ 2313.240843] env[68571]: ERROR nova.compute.manager [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] [ 2313.240955] env[68571]: DEBUG nova.compute.utils [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2313.242209] env[68571]: DEBUG nova.compute.manager [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Build of instance ad3a9183-0e9e-44df-b920-b8b8360a65e5 was re-scheduled: A specified parameter was not correct: fileType [ 2313.242209] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2313.242572] env[68571]: DEBUG nova.compute.manager [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2313.242798] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Acquiring lock "refresh_cache-ad3a9183-0e9e-44df-b920-b8b8360a65e5" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2313.242946] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Acquired lock 
"refresh_cache-ad3a9183-0e9e-44df-b920-b8b8360a65e5" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2313.243121] env[68571]: DEBUG nova.network.neutron [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2313.265360] env[68571]: DEBUG nova.network.neutron [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2313.323476] env[68571]: DEBUG nova.network.neutron [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2313.332708] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Releasing lock "refresh_cache-ad3a9183-0e9e-44df-b920-b8b8360a65e5" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2313.332937] env[68571]: DEBUG nova.compute.manager [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2313.333142] env[68571]: DEBUG nova.compute.manager [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Skipping network deallocation for instance since networking was not requested. 
{{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 2313.416887] env[68571]: INFO nova.scheduler.client.report [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Deleted allocations for instance ad3a9183-0e9e-44df-b920-b8b8360a65e5 [ 2313.435278] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bc174b91-e67a-477e-bd0e-b774df00d226 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Lock "ad3a9183-0e9e-44df-b920-b8b8360a65e5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 368.833s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2313.435518] env[68571]: DEBUG oslo_concurrency.lockutils [None req-af63edaa-85cb-4b94-844b-6dd7fce0a9f7 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Lock "ad3a9183-0e9e-44df-b920-b8b8360a65e5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 173.112s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2313.435804] env[68571]: DEBUG oslo_concurrency.lockutils [None req-af63edaa-85cb-4b94-844b-6dd7fce0a9f7 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Acquiring lock "ad3a9183-0e9e-44df-b920-b8b8360a65e5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2313.435960] env[68571]: DEBUG oslo_concurrency.lockutils [None req-af63edaa-85cb-4b94-844b-6dd7fce0a9f7 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Lock "ad3a9183-0e9e-44df-b920-b8b8360a65e5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2313.436142] env[68571]: DEBUG oslo_concurrency.lockutils [None req-af63edaa-85cb-4b94-844b-6dd7fce0a9f7 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Lock "ad3a9183-0e9e-44df-b920-b8b8360a65e5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2313.437916] env[68571]: INFO nova.compute.manager [None req-af63edaa-85cb-4b94-844b-6dd7fce0a9f7 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Terminating instance [ 2313.439379] env[68571]: DEBUG oslo_concurrency.lockutils [None req-af63edaa-85cb-4b94-844b-6dd7fce0a9f7 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Acquiring lock "refresh_cache-ad3a9183-0e9e-44df-b920-b8b8360a65e5" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2313.439537] env[68571]: DEBUG oslo_concurrency.lockutils [None req-af63edaa-85cb-4b94-844b-6dd7fce0a9f7 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Acquired lock "refresh_cache-ad3a9183-0e9e-44df-b920-b8b8360a65e5" 
{{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2313.439703] env[68571]: DEBUG nova.network.neutron [None req-af63edaa-85cb-4b94-844b-6dd7fce0a9f7 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2313.462700] env[68571]: DEBUG nova.network.neutron [None req-af63edaa-85cb-4b94-844b-6dd7fce0a9f7 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2313.529405] env[68571]: DEBUG nova.network.neutron [None req-af63edaa-85cb-4b94-844b-6dd7fce0a9f7 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2313.539583] env[68571]: DEBUG oslo_concurrency.lockutils [None req-af63edaa-85cb-4b94-844b-6dd7fce0a9f7 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Releasing lock "refresh_cache-ad3a9183-0e9e-44df-b920-b8b8360a65e5" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2313.539984] env[68571]: DEBUG nova.compute.manager [None req-af63edaa-85cb-4b94-844b-6dd7fce0a9f7 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2313.540190] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-af63edaa-85cb-4b94-844b-6dd7fce0a9f7 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2313.540682] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-5438319b-6098-4ece-82ca-34f07c967fc6 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2313.549293] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-09f50c93-55c7-4ee5-ac73-b3f0ee6a07c4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2313.573509] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-af63edaa-85cb-4b94-844b-6dd7fce0a9f7 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance ad3a9183-0e9e-44df-b920-b8b8360a65e5 could not be found. 
[ 2313.573645] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-af63edaa-85cb-4b94-844b-6dd7fce0a9f7 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2313.573818] env[68571]: INFO nova.compute.manager [None req-af63edaa-85cb-4b94-844b-6dd7fce0a9f7 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Took 0.03 seconds to destroy the instance on the hypervisor. [ 2313.574055] env[68571]: DEBUG oslo.service.loopingcall [None req-af63edaa-85cb-4b94-844b-6dd7fce0a9f7 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2313.574259] env[68571]: DEBUG nova.compute.manager [-] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2313.574353] env[68571]: DEBUG nova.network.neutron [-] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2313.590867] env[68571]: DEBUG nova.network.neutron [-] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2313.597799] env[68571]: DEBUG nova.network.neutron [-] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2313.605381] env[68571]: INFO nova.compute.manager [-] [instance: ad3a9183-0e9e-44df-b920-b8b8360a65e5] Took 0.03 seconds to deallocate network for instance. 
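Both destroy passes hand network deallocation to oslo.service's looping-call machinery, which emits the "Waiting for function ... to return." DEBUG lines above (nova drives _deallocate_network_with_retries with a back-off variant of the same helper). Below is a minimal illustration of the mechanism using the fixed-interval form of the API; the retry body is invented for the example.

from oslo_service import loopingcall

calls = {"n": 0}

def _deallocate_with_retries():
    # Invented body: fail once, then succeed and stop the loop.
    calls["n"] += 1
    if calls["n"] < 2:
        return  # no LoopingCallDone -> called again after `interval`
    raise loopingcall.LoopingCallDone(retvalue=True)

timer = loopingcall.FixedIntervalLoopingCall(_deallocate_with_retries)
# start() schedules the loop in a greenthread and returns an event;
# wait() blocks until the function raises LoopingCallDone.
result = timer.start(interval=0.1).wait()
assert result is True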
[ 2313.694692] env[68571]: DEBUG oslo_concurrency.lockutils [None req-af63edaa-85cb-4b94-844b-6dd7fce0a9f7 tempest-ServerShowV254Test-322398376 tempest-ServerShowV254Test-322398376-project-member] Lock "ad3a9183-0e9e-44df-b920-b8b8360a65e5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.259s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2314.472129] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._cleanup_running_deleted_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2314.472595] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Getting list of instances from cluster (obj){ [ 2314.472595] env[68571]: value = "domain-c8" [ 2314.472595] env[68571]: _type = "ClusterComputeResource" [ 2314.472595] env[68571]: } {{(pid=68571) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 2314.473730] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-522c544b-75c2-491d-956e-49a84ce3adf4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2314.486076] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Got total of 3 instances {{(pid=68571) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 2333.521348] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rebooting_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2333.521774] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_volume_usage {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2336.490199] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager.update_available_resource {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2336.501917] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2336.502151] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2336.502318] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s {{(pid=68571) inner 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2336.502475] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Auditing locally available compute resources for cpu-1 (node: domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28) {{(pid=68571) update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:907}} [ 2336.503548] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4648a628-1138-4cb3-9aae-1c740903a3f2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2336.512188] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-9118bb50-d4ab-4434-9af1-020f5034e751 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2336.527287] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1e056992-8b1a-4092-95c4-05a7e7cd5e48 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2336.533213] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-035796c5-ab35-4cd3-be13-c0cd60bf100a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2336.561652] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Hypervisor/Node resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 free_ram=180885MB free_disk=94GB free_vcpus=48 pci_devices=None {{(pid=68571) _report_hypervisor_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1106}} [ 2336.561844] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2336.562109] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2336.717891] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 49f25bb6-d27f-468c-ba5d-2f5d96bb04df actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2336.718069] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 17530424-18ad-4713-ae56-acbe585bd5d9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. 
{{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2336.718206] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Instance 1799a6b4-70c1-4a96-9bf9-4e855c11039f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. {{(pid=68571) _remove_deleted_instances_allocations /opt/stack/nova/nova/compute/resource_tracker.py:1707}} [ 2336.718389] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Total usable vcpus: 48, total allocated vcpus: 3 {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1129}} [ 2336.718528] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Final resource view: name=domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 phys_ram=196590MB used_ram=896MB phys_disk=200GB used_disk=3GB total_vcpus=48 used_vcpus=3 pci_stats=[] {{(pid=68571) _report_final_resource_view /opt/stack/nova/nova/compute/resource_tracker.py:1138}} [ 2336.733610] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Refreshing inventories for resource provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:804}} [ 2336.747177] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Updating ProviderTree inventory for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd from _refresh_and_get_inventory using data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) _refresh_and_get_inventory /opt/stack/nova/nova/scheduler/client/report.py:768}} [ 2336.747356] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Updating inventory in ProviderTree for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd with inventory: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:176}} [ 2336.757265] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Refreshing aggregate associations for resource provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd, aggregates: None {{(pid=68571) _refresh_associations /opt/stack/nova/nova/scheduler/client/report.py:813}} [ 2336.772475] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Refreshing trait associations for resource provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd, traits: COMPUTE_IMAGE_TYPE_VMDK,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SAME_HOST_COLD_MIGRATE,COMPUTE_NODE {{(pid=68571) _refresh_associations 
/opt/stack/nova/nova/scheduler/client/report.py:825}} [ 2336.813068] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-56384b59-7e08-48f5-80dd-81b1a8273efd {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2336.820358] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-5a85e3b9-3ad4-43e9-ad06-12bf17b7889f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2336.850676] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-47814336-46e9-4c50-8f50-cd525de2364a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2336.857735] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ae0bcae-f460-4e6b-8d10-e3c44c4ae462 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2336.870226] env[68571]: DEBUG nova.compute.provider_tree [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2336.878344] env[68571]: DEBUG nova.scheduler.client.report [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2336.892942] env[68571]: DEBUG nova.compute.resource_tracker [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Compute_service record updated for cpu-1:domain-c8.fc996f14-c53b-4953-92e3-bdfa48f5cc28 {{(pid=68571) _update_available_resource /opt/stack/nova/nova/compute/resource_tracker.py:1067}} [ 2336.893129] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.331s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2337.276194] env[68571]: DEBUG oslo_concurrency.lockutils [None req-15a1e11e-fbe8-405d-ac0d-6f3eb234394e tempest-ServerDiskConfigTestJSON-670880793 tempest-ServerDiskConfigTestJSON-670880793-project-member] Acquiring lock "1799a6b4-70c1-4a96-9bf9-4e855c11039f" by "nova.compute.manager.ComputeManager.terminate_instance..do_terminate_instance" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2337.892210] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_rescued_instances {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2338.489453] env[68571]: DEBUG 
oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._instance_usage_audit {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2339.489247] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._heal_instance_info_cache {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2339.489517] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Starting heal instance info cache {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9911}} [ 2339.489556] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Rebuilding the list of instances to heal {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9915}} [ 2339.503906] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2339.504124] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2339.504265] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 1799a6b4-70c1-4a96-9bf9-4e855c11039f] Skipping network cache update for instance because it is Building. {{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9924}} [ 2339.504393] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Didn't find any instances for network info cache update. 
{{(pid=68571) _heal_instance_info_cache /opt/stack/nova/nova/compute/manager.py:9997}} [ 2340.500073] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._check_instance_build_time {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2343.490033] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._poll_unconfirmed_resizes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2344.452224] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._sync_power_states {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2344.464631] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Getting list of instances from cluster (obj){ [ 2344.464631] env[68571]: value = "domain-c8" [ 2344.464631] env[68571]: _type = "ClusterComputeResource" [ 2344.464631] env[68571]: } {{(pid=68571) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2122}} [ 2344.466308] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-581e0521-409e-4174-9e0b-2c76194d4e9d {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2344.478349] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Got total of 3 instances {{(pid=68571) list_instances /opt/stack/nova/nova/virt/vmwareapi/vmops.py:2131}} [ 2344.478681] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Triggering sync for uuid 49f25bb6-d27f-468c-ba5d-2f5d96bb04df {{(pid=68571) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2344.479055] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Triggering sync for uuid 17530424-18ad-4713-ae56-acbe585bd5d9 {{(pid=68571) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2344.479385] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Triggering sync for uuid 1799a6b4-70c1-4a96-9bf9-4e855c11039f {{(pid=68571) _sync_power_states /opt/stack/nova/nova/compute/manager.py:10321}} [ 2344.479800] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "49f25bb6-d27f-468c-ba5d-2f5d96bb04df" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2344.480188] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "17530424-18ad-4713-ae56-acbe585bd5d9" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2344.480556] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Acquiring lock "1799a6b4-70c1-4a96-9bf9-4e855c11039f" 
by "nova.compute.manager.ComputeManager._sync_power_states.._sync..query_driver_power_state_and_sync" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2347.490769] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._reclaim_queued_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2347.490769] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] CONF.reclaim_instance_interval <= 0, skipping... {{(pid=68571) _reclaim_queued_deletes /opt/stack/nova/nova/compute/manager.py:10530}} [ 2350.489957] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._cleanup_incomplete_migrations {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2350.489957] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Cleaning up deleted instances with incomplete migration {{(pid=68571) _cleanup_incomplete_migrations /opt/stack/nova/nova/compute/manager.py:11236}} [ 2355.500321] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._run_pending_deletes {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2355.500641] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Cleaning up deleted instances {{(pid=68571) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11198}} [ 2355.509381] env[68571]: DEBUG nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] There are 0 instances to clean {{(pid=68571) _run_pending_deletes /opt/stack/nova/nova/compute/manager.py:11207}} [ 2357.494546] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._sync_scheduler_instance_info {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2358.735466] env[68571]: WARNING oslo_vmware.rw_handles [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Error occurred while reading the HTTP response.: http.client.RemoteDisconnected: Remote end closed connection without response [ 2358.735466] env[68571]: ERROR oslo_vmware.rw_handles Traceback (most recent call last): [ 2358.735466] env[68571]: ERROR oslo_vmware.rw_handles File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py", line 283, in close [ 2358.735466] env[68571]: ERROR oslo_vmware.rw_handles self._conn.getresponse() [ 2358.735466] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse [ 2358.735466] env[68571]: ERROR oslo_vmware.rw_handles response.begin() [ 2358.735466] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 318, in begin [ 2358.735466] env[68571]: ERROR oslo_vmware.rw_handles version, status, reason = self._read_status() [ 2358.735466] env[68571]: ERROR oslo_vmware.rw_handles File "/usr/lib/python3.10/http/client.py", line 287, in 
_read_status [ 2358.735466] env[68571]: ERROR oslo_vmware.rw_handles raise RemoteDisconnected("Remote end closed connection without" [ 2358.735466] env[68571]: ERROR oslo_vmware.rw_handles http.client.RemoteDisconnected: Remote end closed connection without response [ 2358.735466] env[68571]: ERROR oslo_vmware.rw_handles [ 2358.736102] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Downloaded image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to vmware_temp/f8f29db3-f44e-48d6-8fc6-fb81528f98a1/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:258}} [ 2358.738439] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Caching image {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:638}} [ 2358.738772] env[68571]: DEBUG nova.virt.vmwareapi.vm_util [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Copying Virtual Disk [datastore1] vmware_temp/f8f29db3-f44e-48d6-8fc6-fb81528f98a1/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk to [datastore1] vmware_temp/f8f29db3-f44e-48d6-8fc6-fb81528f98a1/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk {{(pid=68571) copy_virtual_disk /opt/stack/nova/nova/virt/vmwareapi/vm_util.py:1413}} [ 2358.739103] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualDiskManager.CopyVirtualDisk_Task with opID=oslo.vmware-9e6fd2cc-9a73-4384-a9d1-9f6590e79294 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2358.747360] env[68571]: DEBUG oslo_vmware.api [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Waiting for the task: (returnval){ [ 2358.747360] env[68571]: value = "task-3467795" [ 2358.747360] env[68571]: _type = "Task" [ 2358.747360] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2358.755065] env[68571]: DEBUG oslo_vmware.api [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Task: {'id': task-3467795, 'name': CopyVirtualDisk_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2359.257885] env[68571]: DEBUG oslo_vmware.exceptions [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Fault InvalidArgument not matched. 
{{(pid=68571) get_fault_class /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/exceptions.py:290}} [ 2359.258175] env[68571]: DEBUG oslo_concurrency.lockutils [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Releasing lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2359.258729] env[68571]: ERROR nova.compute.manager [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Instance failed to spawn: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2359.258729] env[68571]: Faults: ['InvalidArgument'] [ 2359.258729] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Traceback (most recent call last): [ 2359.258729] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] File "/opt/stack/nova/nova/compute/manager.py", line 2868, in _build_resources [ 2359.258729] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] yield resources [ 2359.258729] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2359.258729] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] self.driver.spawn(context, instance, image_meta, [ 2359.258729] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2359.258729] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2359.258729] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2359.258729] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] self._fetch_image_if_missing(context, vi) [ 2359.258729] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2359.259134] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] image_cache(vi, tmp_image_ds_loc) [ 2359.259134] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2359.259134] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] vm_util.copy_virtual_disk( [ 2359.259134] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2359.259134] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] session._wait_for_task(vmdk_copy_task) [ 2359.259134] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, 
in _wait_for_task [ 2359.259134] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] return self.wait_for_task(task_ref) [ 2359.259134] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2359.259134] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] return evt.wait() [ 2359.259134] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2359.259134] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] result = hub.switch() [ 2359.259134] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2359.259134] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] return self.greenlet.switch() [ 2359.259488] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2359.259488] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] self.f(*self.args, **self.kw) [ 2359.259488] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2359.259488] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] raise exceptions.translate_fault(task_info.error) [ 2359.259488] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2359.259488] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Faults: ['InvalidArgument'] [ 2359.259488] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] [ 2359.259488] env[68571]: INFO nova.compute.manager [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Terminating instance [ 2359.260704] env[68571]: DEBUG oslo_concurrency.lockutils [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Acquired lock "[datastore1] devstack-image-cache_base/6e7bf233-3ffe-4b3b-a510-62353d0292a6/6e7bf233-3ffe-4b3b-a510-62353d0292a6.vmdk" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2359.260818] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Creating directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2359.260967] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-6c671ccf-187f-49db-8777-6a09611d9944 {{(pid=68571) request_handler 
/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2359.263150] env[68571]: DEBUG oslo_concurrency.lockutils [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Acquiring lock "refresh_cache-49f25bb6-d27f-468c-ba5d-2f5d96bb04df" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2359.263311] env[68571]: DEBUG oslo_concurrency.lockutils [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Acquired lock "refresh_cache-49f25bb6-d27f-468c-ba5d-2f5d96bb04df" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2359.263483] env[68571]: DEBUG nova.network.neutron [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2359.270334] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Created directory with path [datastore1] devstack-image-cache_base {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2359.270503] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Folder [datastore1] devstack-image-cache_base created. {{(pid=68571) _create_folder_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1851}} [ 2359.271653] env[68571]: DEBUG oslo_vmware.service [-] Invoking HostDatastoreBrowser.SearchDatastore_Task with opID=oslo.vmware-aeb13a11-8d33-4b34-8d63-590d35aadc1a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2359.278916] env[68571]: DEBUG oslo_vmware.api [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Waiting for the task: (returnval){ [ 2359.278916] env[68571]: value = "session[52d81342-85e4-ea29-2389-62ee1f7826ca]52687ee8-214a-4473-7cc4-949a7ac3f158" [ 2359.278916] env[68571]: _type = "Task" [ 2359.278916] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2359.286112] env[68571]: DEBUG oslo_vmware.api [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Task: {'id': session[52d81342-85e4-ea29-2389-62ee1f7826ca]52687ee8-214a-4473-7cc4-949a7ac3f158, 'name': SearchDatastore_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2359.293704] env[68571]: DEBUG nova.network.neutron [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Instance cache missing network info. 
{{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2359.353432] env[68571]: DEBUG nova.network.neutron [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2359.361850] env[68571]: DEBUG oslo_concurrency.lockutils [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Releasing lock "refresh_cache-49f25bb6-d27f-468c-ba5d-2f5d96bb04df" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2359.362240] env[68571]: DEBUG nova.compute.manager [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Start destroying the instance on the hypervisor. {{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2359.362428] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2359.363446] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4945b3ed-245d-4325-93b2-9573a99a5416 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2359.371152] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Unregistering the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1094}} [ 2359.371372] env[68571]: DEBUG oslo_vmware.service [-] Invoking VirtualMachine.UnregisterVM with opID=oslo.vmware-526cb054-cd20-4c3c-96f8-9b488a1fa68a {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2359.398019] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Unregistered the VM {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1097}} [ 2359.398216] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Deleting contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1107}} [ 2359.398395] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Deleting the datastore file [datastore1] 49f25bb6-d27f-468c-ba5d-2f5d96bb04df {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:211}} [ 2359.398617] env[68571]: 
DEBUG oslo_vmware.service [-] Invoking FileManager.DeleteDatastoreFile_Task with opID=oslo.vmware-428cf548-3058-4724-a520-9ee18befebe0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2359.404435] env[68571]: DEBUG oslo_vmware.api [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Waiting for the task: (returnval){ [ 2359.404435] env[68571]: value = "task-3467797" [ 2359.404435] env[68571]: _type = "Task" [ 2359.404435] env[68571]: } to complete. {{(pid=68571) wait_for_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:397}} [ 2359.411526] env[68571]: DEBUG oslo_vmware.api [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Task: {'id': task-3467797, 'name': DeleteDatastoreFile_Task} progress is 0%. {{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:434}} [ 2359.489821] env[68571]: DEBUG oslo_service.periodic_task [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens {{(pid=68571) run_periodic_tasks /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/periodic_task.py:210}} [ 2359.789218] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Preparing fetch location {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:633}} [ 2359.789442] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Creating directory with path [datastore1] vmware_temp/de36716b-6411-4e2a-85db-1f6e34d25e51/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:399}} [ 2359.789484] env[68571]: DEBUG oslo_vmware.service [-] Invoking FileManager.MakeDirectory with opID=oslo.vmware-c5efad51-deba-4c14-b619-a15afe65e490 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2359.800683] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Created directory with path [datastore1] vmware_temp/de36716b-6411-4e2a-85db-1f6e34d25e51/6e7bf233-3ffe-4b3b-a510-62353d0292a6 {{(pid=68571) mkdir /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:404}} [ 2359.800874] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Fetch image to [datastore1] vmware_temp/de36716b-6411-4e2a-85db-1f6e34d25e51/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk {{(pid=68571) _fetch_image_if_missing /opt/stack/nova/nova/virt/vmwareapi/vmops.py:635}} [ 2359.801058] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Downloading 
image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to [datastore1] vmware_temp/de36716b-6411-4e2a-85db-1f6e34d25e51/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk on the data store datastore1 {{(pid=68571) _fetch_image_as_file /opt/stack/nova/nova/virt/vmwareapi/vmops.py:399}} [ 2359.801771] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-d6fd01a1-e228-4830-b877-2e4856e7eee0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2359.808222] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-45a5bfa4-42ce-4ecb-b458-f06400f69c0c {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2359.817383] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-dcbe5125-30ca-4e6e-aff8-834fab5ba7a4 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2359.848782] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-4fa4b099-dbfa-4cc5-9e12-d25a0f533f31 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2359.854671] env[68571]: DEBUG oslo_vmware.service [-] Invoking SessionManager.AcquireGenericServiceTicket with opID=oslo.vmware-bb88c088-e0da-488c-a94b-6e96916d7fc0 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2359.877312] env[68571]: DEBUG nova.virt.vmwareapi.images [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] [instance: 17530424-18ad-4713-ae56-acbe585bd5d9] Downloading image file data 6e7bf233-3ffe-4b3b-a510-62353d0292a6 to the data store datastore1 {{(pid=68571) fetch_image /opt/stack/nova/nova/virt/vmwareapi/images.py:245}} [ 2359.912356] env[68571]: DEBUG oslo_vmware.api [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Task: {'id': task-3467797, 'name': DeleteDatastoreFile_Task, 'duration_secs': 0.045935} completed successfully. 
{{(pid=68571) _poll_task /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py:444}} [ 2359.912611] env[68571]: DEBUG nova.virt.vmwareapi.ds_util [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Deleted the datastore file {{(pid=68571) file_delete /opt/stack/nova/nova/virt/vmwareapi/ds_util.py:220}} [ 2359.912743] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Deleted contents of the VM from datastore datastore1 {{(pid=68571) _destroy_instance /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1117}} [ 2359.912912] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2359.913098] env[68571]: INFO nova.compute.manager [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Took 0.55 seconds to destroy the instance on the hypervisor. [ 2359.913337] env[68571]: DEBUG oslo.service.loopingcall [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2359.914974] env[68571]: DEBUG nova.compute.manager [-] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Skipping network deallocation for instance since networking was not requested. 
{{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 2359.917269] env[68571]: DEBUG nova.compute.claims [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Aborting claim: {{(pid=68571) abort /opt/stack/nova/nova/compute/claims.py:85}} [ 2359.917434] env[68571]: DEBUG oslo_concurrency.lockutils [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2359.917645] env[68571]: DEBUG oslo_concurrency.lockutils [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2359.927155] env[68571]: DEBUG oslo_vmware.rw_handles [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Creating HTTP connection to write to file with size = 21318656 and URL = https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/de36716b-6411-4e2a-85db-1f6e34d25e51/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. {{(pid=68571) _create_write_connection /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:122}} [ 2359.987274] env[68571]: DEBUG oslo_vmware.rw_handles [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Completed reading data from the image iterator. {{(pid=68571) read /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:765}} [ 2359.987461] env[68571]: DEBUG oslo_vmware.rw_handles [None req-e1593689-7a3d-466b-bcb2-aaf14c682f99 tempest-DeleteServersTestJSON-1837775584 tempest-DeleteServersTestJSON-1837775584-project-member] Closing write handle for https://esx7c2n2.openstack.eu-de-1.cloud.sap:443/folder/vmware_temp/de36716b-6411-4e2a-85db-1f6e34d25e51/6e7bf233-3ffe-4b3b-a510-62353d0292a6/tmp-sparse.vmdk?dcPath=ha-datacenter&dsName=datastore1. 
{{(pid=68571) close /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/rw_handles.py:281}} [ 2360.036468] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-32a5b320-e022-4165-9109-a4a399d8eb33 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2360.043649] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-a07002ce-a2ff-4726-b60b-47e55b1e132b {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2360.072386] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-172d3df9-a0e1-482c-ae51-778c17bdad05 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2360.079415] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-1ccc0c85-aa8d-4e3c-9562-0173461ba28f {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2360.092853] env[68571]: DEBUG nova.compute.provider_tree [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Inventory has not changed in ProviderTree for provider: 00d803b3-09f1-4a26-8896-bee0c6f9c5dd {{(pid=68571) update_inventory /opt/stack/nova/nova/compute/provider_tree.py:180}} [ 2360.101991] env[68571]: DEBUG nova.scheduler.client.report [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Inventory has not changed for provider 00d803b3-09f1-4a26-8896-bee0c6f9c5dd based on inventory data: {'VCPU': {'total': 48, 'reserved': 0, 'min_unit': 1, 'max_unit': 16, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 196590, 'reserved': 512, 'min_unit': 1, 'max_unit': 65530, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 400, 'reserved': 0, 'min_unit': 1, 'max_unit': 94, 'step_size': 1, 'allocation_ratio': 1.0}} {{(pid=68571) set_inventory_for_provider /opt/stack/nova/nova/scheduler/client/report.py:940}} [ 2360.115528] env[68571]: DEBUG oslo_concurrency.lockutils [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.abort_instance_claim" :: held 0.198s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2360.116040] env[68571]: ERROR nova.compute.manager [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Failed to build and run instance: oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2360.116040] env[68571]: Faults: ['InvalidArgument'] [ 2360.116040] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Traceback (most recent call last): [ 2360.116040] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] File "/opt/stack/nova/nova/compute/manager.py", line 2615, in _build_and_run_instance [ 2360.116040] env[68571]: ERROR nova.compute.manager [instance: 
49f25bb6-d27f-468c-ba5d-2f5d96bb04df] self.driver.spawn(context, instance, image_meta, [ 2360.116040] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] File "/opt/stack/nova/nova/virt/vmwareapi/driver.py", line 539, in spawn [ 2360.116040] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] self._vmops.spawn(context, instance, image_meta, injected_files, [ 2360.116040] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 786, in spawn [ 2360.116040] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] self._fetch_image_if_missing(context, vi) [ 2360.116040] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 639, in _fetch_image_if_missing [ 2360.116040] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] image_cache(vi, tmp_image_ds_loc) [ 2360.116040] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] File "/opt/stack/nova/nova/virt/vmwareapi/vmops.py", line 537, in _cache_sparse_image [ 2360.116383] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] vm_util.copy_virtual_disk( [ 2360.116383] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] File "/opt/stack/nova/nova/virt/vmwareapi/vm_util.py", line 1423, in copy_virtual_disk [ 2360.116383] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] session._wait_for_task(vmdk_copy_task) [ 2360.116383] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] File "/opt/stack/nova/nova/virt/vmwareapi/session.py", line 157, in _wait_for_task [ 2360.116383] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] return self.wait_for_task(task_ref) [ 2360.116383] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 398, in wait_for_task [ 2360.116383] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] return evt.wait() [ 2360.116383] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/event.py", line 125, in wait [ 2360.116383] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] result = hub.switch() [ 2360.116383] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] File "/opt/stack/data/venv/lib/python3.10/site-packages/eventlet/hubs/hub.py", line 313, in switch [ 2360.116383] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] return self.greenlet.switch() [ 2360.116383] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/common/loopingcall.py", line 75, in _inner [ 2360.116383] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] self.f(*self.args, **self.kw) [ 2360.116709] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] File 
"/opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/api.py", line 448, in _poll_task [ 2360.116709] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] raise exceptions.translate_fault(task_info.error) [ 2360.116709] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] oslo_vmware.exceptions.VimFaultException: A specified parameter was not correct: fileType [ 2360.116709] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Faults: ['InvalidArgument'] [ 2360.116709] env[68571]: ERROR nova.compute.manager [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] [ 2360.116843] env[68571]: DEBUG nova.compute.utils [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] VimFaultException {{(pid=68571) notify_about_instance_usage /opt/stack/nova/nova/compute/utils.py:430}} [ 2360.118138] env[68571]: DEBUG nova.compute.manager [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Build of instance 49f25bb6-d27f-468c-ba5d-2f5d96bb04df was re-scheduled: A specified parameter was not correct: fileType [ 2360.118138] env[68571]: Faults: ['InvalidArgument'] {{(pid=68571) _do_build_and_run_instance /opt/stack/nova/nova/compute/manager.py:2454}} [ 2360.118510] env[68571]: DEBUG nova.compute.manager [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Unplugging VIFs for instance {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:2980}} [ 2360.118729] env[68571]: DEBUG oslo_concurrency.lockutils [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Acquiring lock "refresh_cache-49f25bb6-d27f-468c-ba5d-2f5d96bb04df" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2360.118875] env[68571]: DEBUG oslo_concurrency.lockutils [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Acquired lock "refresh_cache-49f25bb6-d27f-468c-ba5d-2f5d96bb04df" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2360.119048] env[68571]: DEBUG nova.network.neutron [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2360.143379] env[68571]: DEBUG nova.network.neutron [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Instance cache missing network info. 
{{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2360.203906] env[68571]: DEBUG nova.network.neutron [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2360.212950] env[68571]: DEBUG oslo_concurrency.lockutils [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Releasing lock "refresh_cache-49f25bb6-d27f-468c-ba5d-2f5d96bb04df" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2360.213191] env[68571]: DEBUG nova.compute.manager [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Virt driver does not provide unplug_vifs method, so it is not possible determine if VIFs should be unplugged. {{(pid=68571) _cleanup_allocated_networks /opt/stack/nova/nova/compute/manager.py:3003}} [ 2360.213376] env[68571]: DEBUG nova.compute.manager [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Skipping network deallocation for instance since networking was not requested. {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2259}} [ 2360.299511] env[68571]: INFO nova.scheduler.client.report [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Deleted allocations for instance 49f25bb6-d27f-468c-ba5d-2f5d96bb04df [ 2360.319234] env[68571]: DEBUG oslo_concurrency.lockutils [None req-859c473f-9bc6-4132-8924-96c50f35f0f6 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Lock "49f25bb6-d27f-468c-ba5d-2f5d96bb04df" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 385.663s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2360.319467] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bfcdb4bb-0c10-477b-83a7-38ffe08df641 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Lock "49f25bb6-d27f-468c-ba5d-2f5d96bb04df" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 189.409s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2360.319678] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bfcdb4bb-0c10-477b-83a7-38ffe08df641 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Acquiring lock "49f25bb6-d27f-468c-ba5d-2f5d96bb04df-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:404}} [ 2360.319878] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bfcdb4bb-0c10-477b-83a7-38ffe08df641 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Lock "49f25bb6-d27f-468c-ba5d-2f5d96bb04df-events" acquired by 
"nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: waited 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2360.320057] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bfcdb4bb-0c10-477b-83a7-38ffe08df641 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Lock "49f25bb6-d27f-468c-ba5d-2f5d96bb04df-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.._clear_events" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2360.321825] env[68571]: INFO nova.compute.manager [None req-bfcdb4bb-0c10-477b-83a7-38ffe08df641 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Terminating instance [ 2360.323296] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bfcdb4bb-0c10-477b-83a7-38ffe08df641 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Acquiring lock "refresh_cache-49f25bb6-d27f-468c-ba5d-2f5d96bb04df" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:312}} [ 2360.323458] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bfcdb4bb-0c10-477b-83a7-38ffe08df641 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Acquired lock "refresh_cache-49f25bb6-d27f-468c-ba5d-2f5d96bb04df" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:315}} [ 2360.323626] env[68571]: DEBUG nova.network.neutron [None req-bfcdb4bb-0c10-477b-83a7-38ffe08df641 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Building network info cache for instance {{(pid=68571) _get_instance_nw_info /opt/stack/nova/nova/network/neutron.py:2010}} [ 2360.351811] env[68571]: DEBUG nova.network.neutron [None req-bfcdb4bb-0c10-477b-83a7-38ffe08df641 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2360.409117] env[68571]: DEBUG nova.network.neutron [None req-bfcdb4bb-0c10-477b-83a7-38ffe08df641 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2360.417644] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bfcdb4bb-0c10-477b-83a7-38ffe08df641 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Releasing lock "refresh_cache-49f25bb6-d27f-468c-ba5d-2f5d96bb04df" {{(pid=68571) lock /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:333}} [ 2360.418017] env[68571]: DEBUG nova.compute.manager [None req-bfcdb4bb-0c10-477b-83a7-38ffe08df641 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Start destroying the instance on the hypervisor. 
{{(pid=68571) _shutdown_instance /opt/stack/nova/nova/compute/manager.py:3124}} [ 2360.418228] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bfcdb4bb-0c10-477b-83a7-38ffe08df641 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Destroying instance {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1142}} [ 2360.418733] env[68571]: DEBUG oslo_vmware.service [-] Invoking SearchIndex.FindAllByUuid with opID=oslo.vmware-57125d5c-c15a-43aa-b0dd-81c1726876f2 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2360.427753] env[68571]: DEBUG oslo_vmware.service [-] Invoking PropertyCollector.RetrievePropertiesEx with opID=oslo.vmware-34f6b44c-deb6-450b-9f72-bf0d2b7bc999 {{(pid=68571) request_handler /opt/stack/data/venv/lib/python3.10/site-packages/oslo_vmware/service.py:371}} [ 2360.451641] env[68571]: WARNING nova.virt.vmwareapi.vmops [None req-bfcdb4bb-0c10-477b-83a7-38ffe08df641 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Instance does not exist on backend: nova.exception.InstanceNotFound: Instance 49f25bb6-d27f-468c-ba5d-2f5d96bb04df could not be found. [ 2360.451831] env[68571]: DEBUG nova.virt.vmwareapi.vmops [None req-bfcdb4bb-0c10-477b-83a7-38ffe08df641 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Instance destroyed {{(pid=68571) destroy /opt/stack/nova/nova/virt/vmwareapi/vmops.py:1144}} [ 2360.452012] env[68571]: INFO nova.compute.manager [None req-bfcdb4bb-0c10-477b-83a7-38ffe08df641 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Took 0.03 seconds to destroy the instance on the hypervisor. [ 2360.452255] env[68571]: DEBUG oslo.service.loopingcall [None req-bfcdb4bb-0c10-477b-83a7-38ffe08df641 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. {{(pid=68571) func /opt/stack/data/venv/lib/python3.10/site-packages/oslo_service/loopingcall.py:435}} [ 2360.452465] env[68571]: DEBUG nova.compute.manager [-] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Deallocating network for instance {{(pid=68571) _deallocate_network /opt/stack/nova/nova/compute/manager.py:2263}} [ 2360.452616] env[68571]: DEBUG nova.network.neutron [-] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] deallocate_for_instance() {{(pid=68571) deallocate_for_instance /opt/stack/nova/nova/network/neutron.py:1803}} [ 2360.468713] env[68571]: DEBUG nova.network.neutron [-] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Instance cache missing network info. {{(pid=68571) _get_preexisting_port_ids /opt/stack/nova/nova/network/neutron.py:3323}} [ 2360.476443] env[68571]: DEBUG nova.network.neutron [-] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Updating instance_info_cache with network_info: [] {{(pid=68571) update_instance_cache_with_nw_info /opt/stack/nova/nova/network/neutron.py:116}} [ 2360.484338] env[68571]: INFO nova.compute.manager [-] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] Took 0.03 seconds to deallocate network for instance. 
[ 2360.567708] env[68571]: DEBUG oslo_concurrency.lockutils [None req-bfcdb4bb-0c10-477b-83a7-38ffe08df641 tempest-ServerShowV257Test-819981183 tempest-ServerShowV257Test-819981183-project-member] Lock "49f25bb6-d27f-468c-ba5d-2f5d96bb04df" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 0.248s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}} [ 2360.568530] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "49f25bb6-d27f-468c-ba5d-2f5d96bb04df" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 16.089s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:409}} [ 2360.568719] env[68571]: INFO nova.compute.manager [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] [instance: 49f25bb6-d27f-468c-ba5d-2f5d96bb04df] During sync_power_state the instance has a pending task (deleting). Skip. [ 2360.568890] env[68571]: DEBUG oslo_concurrency.lockutils [None req-d72e3ffe-2cec-4850-95fc-f15438954c57 None None] Lock "49f25bb6-d27f-468c-ba5d-2f5d96bb04df" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s {{(pid=68571) inner /opt/stack/data/venv/lib/python3.10/site-packages/oslo_concurrency/lockutils.py:423}}